Dec 11 00:08:47 np0005554845 kernel: Linux version 5.14.0-648.el9.x86_64 (mockbuild@x86-05.stream.rdu2.redhat.com) (gcc (GCC) 11.5.0 20240719 (Red Hat 11.5.0-14), GNU ld version 2.35.2-69.el9) #1 SMP PREEMPT_DYNAMIC Fri Dec 5 11:18:23 UTC 2025
Dec 11 00:08:47 np0005554845 kernel: The list of certified hardware and cloud instances for Red Hat Enterprise Linux 9 can be viewed at the Red Hat Ecosystem Catalog, https://catalog.redhat.com.
Dec 11 00:08:47 np0005554845 kernel: Command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64 root=UUID=cbdedf45-ed1d-4952-82a8-33a12c0ba266 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 11 00:08:47 np0005554845 kernel: BIOS-provided physical RAM map:
Dec 11 00:08:47 np0005554845 kernel: BIOS-e820: [mem 0x0000000000000000-0x000000000009fbff] usable
Dec 11 00:08:47 np0005554845 kernel: BIOS-e820: [mem 0x000000000009fc00-0x000000000009ffff] reserved
Dec 11 00:08:47 np0005554845 kernel: BIOS-e820: [mem 0x00000000000f0000-0x00000000000fffff] reserved
Dec 11 00:08:47 np0005554845 kernel: BIOS-e820: [mem 0x0000000000100000-0x00000000bffdafff] usable
Dec 11 00:08:47 np0005554845 kernel: BIOS-e820: [mem 0x00000000bffdb000-0x00000000bfffffff] reserved
Dec 11 00:08:47 np0005554845 kernel: BIOS-e820: [mem 0x00000000feffc000-0x00000000feffffff] reserved
Dec 11 00:08:47 np0005554845 kernel: BIOS-e820: [mem 0x00000000fffc0000-0x00000000ffffffff] reserved
Dec 11 00:08:47 np0005554845 kernel: BIOS-e820: [mem 0x0000000100000000-0x000000023fffffff] usable
Dec 11 00:08:47 np0005554845 kernel: NX (Execute Disable) protection: active
Dec 11 00:08:47 np0005554845 kernel: APIC: Static calls initialized
Dec 11 00:08:47 np0005554845 kernel: SMBIOS 2.8 present.
Dec 11 00:08:47 np0005554845 kernel: DMI: OpenStack Foundation OpenStack Nova, BIOS 1.15.0-1 04/01/2014
Dec 11 00:08:47 np0005554845 kernel: Hypervisor detected: KVM
Dec 11 00:08:47 np0005554845 kernel: kvm-clock: Using msrs 4b564d01 and 4b564d00
Dec 11 00:08:47 np0005554845 kernel: kvm-clock: using sched offset of 3320336339 cycles
Dec 11 00:08:47 np0005554845 kernel: clocksource: kvm-clock: mask: 0xffffffffffffffff max_cycles: 0x1cd42e4dffb, max_idle_ns: 881590591483 ns
Dec 11 00:08:47 np0005554845 kernel: tsc: Detected 2799.998 MHz processor
Dec 11 00:08:47 np0005554845 kernel: last_pfn = 0x240000 max_arch_pfn = 0x400000000
Dec 11 00:08:47 np0005554845 kernel: MTRR map: 4 entries (3 fixed + 1 variable; max 19), built from 8 variable MTRRs
Dec 11 00:08:47 np0005554845 kernel: x86/PAT: Configuration [0-7]: WB  WC  UC- UC  WB  WP  UC- WT  
Dec 11 00:08:47 np0005554845 kernel: last_pfn = 0xbffdb max_arch_pfn = 0x400000000
Dec 11 00:08:47 np0005554845 kernel: found SMP MP-table at [mem 0x000f5ae0-0x000f5aef]
Dec 11 00:08:47 np0005554845 kernel: Using GB pages for direct mapping
Dec 11 00:08:47 np0005554845 kernel: RAMDISK: [mem 0x2d46a000-0x32a2cfff]
Dec 11 00:08:47 np0005554845 kernel: ACPI: Early table checksum verification disabled
Dec 11 00:08:47 np0005554845 kernel: ACPI: RSDP 0x00000000000F5AA0 000014 (v00 BOCHS )
Dec 11 00:08:47 np0005554845 kernel: ACPI: RSDT 0x00000000BFFE16BD 000030 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 11 00:08:47 np0005554845 kernel: ACPI: FACP 0x00000000BFFE1571 000074 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 11 00:08:47 np0005554845 kernel: ACPI: DSDT 0x00000000BFFDFC80 0018F1 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 11 00:08:47 np0005554845 kernel: ACPI: FACS 0x00000000BFFDFC40 000040
Dec 11 00:08:47 np0005554845 kernel: ACPI: APIC 0x00000000BFFE15E5 0000B0 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 11 00:08:47 np0005554845 kernel: ACPI: WAET 0x00000000BFFE1695 000028 (v01 BOCHS  BXPC     00000001 BXPC 00000001)
Dec 11 00:08:47 np0005554845 kernel: ACPI: Reserving FACP table memory at [mem 0xbffe1571-0xbffe15e4]
Dec 11 00:08:47 np0005554845 kernel: ACPI: Reserving DSDT table memory at [mem 0xbffdfc80-0xbffe1570]
Dec 11 00:08:47 np0005554845 kernel: ACPI: Reserving FACS table memory at [mem 0xbffdfc40-0xbffdfc7f]
Dec 11 00:08:47 np0005554845 kernel: ACPI: Reserving APIC table memory at [mem 0xbffe15e5-0xbffe1694]
Dec 11 00:08:47 np0005554845 kernel: ACPI: Reserving WAET table memory at [mem 0xbffe1695-0xbffe16bc]
Dec 11 00:08:47 np0005554845 kernel: No NUMA configuration found
Dec 11 00:08:47 np0005554845 kernel: Faking a node at [mem 0x0000000000000000-0x000000023fffffff]
Dec 11 00:08:47 np0005554845 kernel: NODE_DATA(0) allocated [mem 0x23ffd5000-0x23fffffff]
Dec 11 00:08:47 np0005554845 kernel: crashkernel reserved: 0x00000000af000000 - 0x00000000bf000000 (256 MB)
Dec 11 00:08:47 np0005554845 kernel: Zone ranges:
Dec 11 00:08:47 np0005554845 kernel:  DMA      [mem 0x0000000000001000-0x0000000000ffffff]
Dec 11 00:08:47 np0005554845 kernel:  DMA32    [mem 0x0000000001000000-0x00000000ffffffff]
Dec 11 00:08:47 np0005554845 kernel:  Normal   [mem 0x0000000100000000-0x000000023fffffff]
Dec 11 00:08:47 np0005554845 kernel:  Device   empty
Dec 11 00:08:47 np0005554845 kernel: Movable zone start for each node
Dec 11 00:08:47 np0005554845 kernel: Early memory node ranges
Dec 11 00:08:47 np0005554845 kernel:  node   0: [mem 0x0000000000001000-0x000000000009efff]
Dec 11 00:08:47 np0005554845 kernel:  node   0: [mem 0x0000000000100000-0x00000000bffdafff]
Dec 11 00:08:47 np0005554845 kernel:  node   0: [mem 0x0000000100000000-0x000000023fffffff]
Dec 11 00:08:47 np0005554845 kernel: Initmem setup node 0 [mem 0x0000000000001000-0x000000023fffffff]
Dec 11 00:08:47 np0005554845 kernel: On node 0, zone DMA: 1 pages in unavailable ranges
Dec 11 00:08:47 np0005554845 kernel: On node 0, zone DMA: 97 pages in unavailable ranges
Dec 11 00:08:47 np0005554845 kernel: On node 0, zone Normal: 37 pages in unavailable ranges
Dec 11 00:08:47 np0005554845 kernel: ACPI: PM-Timer IO Port: 0x608
Dec 11 00:08:47 np0005554845 kernel: ACPI: LAPIC_NMI (acpi_id[0xff] dfl dfl lint[0x1])
Dec 11 00:08:47 np0005554845 kernel: IOAPIC[0]: apic_id 0, version 17, address 0xfec00000, GSI 0-23
Dec 11 00:08:47 np0005554845 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 0 global_irq 2 dfl dfl)
Dec 11 00:08:47 np0005554845 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 5 global_irq 5 high level)
Dec 11 00:08:47 np0005554845 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 9 global_irq 9 high level)
Dec 11 00:08:47 np0005554845 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 10 global_irq 10 high level)
Dec 11 00:08:47 np0005554845 kernel: ACPI: INT_SRC_OVR (bus 0 bus_irq 11 global_irq 11 high level)
Dec 11 00:08:47 np0005554845 kernel: ACPI: Using ACPI (MADT) for SMP configuration information
Dec 11 00:08:47 np0005554845 kernel: TSC deadline timer available
Dec 11 00:08:47 np0005554845 kernel: CPU topo: Max. logical packages:   8
Dec 11 00:08:47 np0005554845 kernel: CPU topo: Max. logical dies:       8
Dec 11 00:08:47 np0005554845 kernel: CPU topo: Max. dies per package:   1
Dec 11 00:08:47 np0005554845 kernel: CPU topo: Max. threads per core:   1
Dec 11 00:08:47 np0005554845 kernel: CPU topo: Num. cores per package:     1
Dec 11 00:08:47 np0005554845 kernel: CPU topo: Num. threads per package:   1
Dec 11 00:08:47 np0005554845 kernel: CPU topo: Allowing 8 present CPUs plus 0 hotplug CPUs
Dec 11 00:08:47 np0005554845 kernel: kvm-guest: APIC: eoi() replaced with kvm_guest_apic_eoi_write()
Dec 11 00:08:47 np0005554845 kernel: PM: hibernation: Registered nosave memory: [mem 0x00000000-0x00000fff]
Dec 11 00:08:47 np0005554845 kernel: PM: hibernation: Registered nosave memory: [mem 0x0009f000-0x0009ffff]
Dec 11 00:08:47 np0005554845 kernel: PM: hibernation: Registered nosave memory: [mem 0x000a0000-0x000effff]
Dec 11 00:08:47 np0005554845 kernel: PM: hibernation: Registered nosave memory: [mem 0x000f0000-0x000fffff]
Dec 11 00:08:47 np0005554845 kernel: PM: hibernation: Registered nosave memory: [mem 0xbffdb000-0xbfffffff]
Dec 11 00:08:47 np0005554845 kernel: PM: hibernation: Registered nosave memory: [mem 0xc0000000-0xfeffbfff]
Dec 11 00:08:47 np0005554845 kernel: PM: hibernation: Registered nosave memory: [mem 0xfeffc000-0xfeffffff]
Dec 11 00:08:47 np0005554845 kernel: PM: hibernation: Registered nosave memory: [mem 0xff000000-0xfffbffff]
Dec 11 00:08:47 np0005554845 kernel: PM: hibernation: Registered nosave memory: [mem 0xfffc0000-0xffffffff]
Dec 11 00:08:47 np0005554845 kernel: [mem 0xc0000000-0xfeffbfff] available for PCI devices
Dec 11 00:08:47 np0005554845 kernel: Booting paravirtualized kernel on KVM
Dec 11 00:08:47 np0005554845 kernel: clocksource: refined-jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1910969940391419 ns
Dec 11 00:08:47 np0005554845 kernel: setup_percpu: NR_CPUS:8192 nr_cpumask_bits:8 nr_cpu_ids:8 nr_node_ids:1
Dec 11 00:08:47 np0005554845 kernel: percpu: Embedded 64 pages/cpu s225280 r8192 d28672 u262144
Dec 11 00:08:47 np0005554845 kernel: kvm-guest: PV spinlocks disabled, no host support
Dec 11 00:08:47 np0005554845 kernel: Kernel command line: BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64 root=UUID=cbdedf45-ed1d-4952-82a8-33a12c0ba266 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 11 00:08:47 np0005554845 kernel: Unknown kernel command line parameters "BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64", will be passed to user space.
Dec 11 00:08:47 np0005554845 kernel: random: crng init done
Dec 11 00:08:47 np0005554845 kernel: Dentry cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: Inode-cache hash table entries: 524288 (order: 10, 4194304 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: Fallback order for Node 0: 0 
Dec 11 00:08:47 np0005554845 kernel: Built 1 zonelists, mobility grouping on.  Total pages: 2064091
Dec 11 00:08:47 np0005554845 kernel: Policy zone: Normal
Dec 11 00:08:47 np0005554845 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 11 00:08:47 np0005554845 kernel: software IO TLB: area num 8.
Dec 11 00:08:47 np0005554845 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=8, Nodes=1
Dec 11 00:08:47 np0005554845 kernel: ftrace: allocating 49357 entries in 193 pages
Dec 11 00:08:47 np0005554845 kernel: ftrace: allocated 193 pages with 3 groups
Dec 11 00:08:47 np0005554845 kernel: Dynamic Preempt: voluntary
Dec 11 00:08:47 np0005554845 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 11 00:08:47 np0005554845 kernel: rcu: 	RCU event tracing is enabled.
Dec 11 00:08:47 np0005554845 kernel: rcu: 	RCU restricting CPUs from NR_CPUS=8192 to nr_cpu_ids=8.
Dec 11 00:08:47 np0005554845 kernel: 	Trampoline variant of Tasks RCU enabled.
Dec 11 00:08:47 np0005554845 kernel: 	Rude variant of Tasks RCU enabled.
Dec 11 00:08:47 np0005554845 kernel: 	Tracing variant of Tasks RCU enabled.
Dec 11 00:08:47 np0005554845 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 11 00:08:47 np0005554845 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=8
Dec 11 00:08:47 np0005554845 kernel: RCU Tasks: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 11 00:08:47 np0005554845 kernel: RCU Tasks Rude: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 11 00:08:47 np0005554845 kernel: RCU Tasks Trace: Setting shift to 3 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=8.
Dec 11 00:08:47 np0005554845 kernel: NR_IRQS: 524544, nr_irqs: 488, preallocated irqs: 16
Dec 11 00:08:47 np0005554845 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 11 00:08:47 np0005554845 kernel: kfence: initialized - using 2097152 bytes for 255 objects at 0x(____ptrval____)-0x(____ptrval____)
Dec 11 00:08:47 np0005554845 kernel: Console: colour VGA+ 80x25
Dec 11 00:08:47 np0005554845 kernel: printk: console [ttyS0] enabled
Dec 11 00:08:47 np0005554845 kernel: ACPI: Core revision 20230331
Dec 11 00:08:47 np0005554845 kernel: APIC: Switch to symmetric I/O mode setup
Dec 11 00:08:47 np0005554845 kernel: x2apic enabled
Dec 11 00:08:47 np0005554845 kernel: APIC: Switched APIC routing to: physical x2apic
Dec 11 00:08:47 np0005554845 kernel: tsc: Marking TSC unstable due to TSCs unsynchronized
Dec 11 00:08:47 np0005554845 kernel: Calibrating delay loop (skipped) preset value.. 5599.99 BogoMIPS (lpj=2799998)
Dec 11 00:08:47 np0005554845 kernel: x86/cpu: User Mode Instruction Prevention (UMIP) activated
Dec 11 00:08:47 np0005554845 kernel: Last level iTLB entries: 4KB 512, 2MB 255, 4MB 127
Dec 11 00:08:47 np0005554845 kernel: Last level dTLB entries: 4KB 512, 2MB 255, 4MB 127, 1GB 0
Dec 11 00:08:47 np0005554845 kernel: Spectre V1 : Mitigation: usercopy/swapgs barriers and __user pointer sanitization
Dec 11 00:08:47 np0005554845 kernel: Spectre V2 : Mitigation: Retpolines
Dec 11 00:08:47 np0005554845 kernel: Spectre V2 : Spectre v2 / SpectreRSB: Filling RSB on context switch and VMEXIT
Dec 11 00:08:47 np0005554845 kernel: Spectre V2 : Enabling Speculation Barrier for firmware calls
Dec 11 00:08:47 np0005554845 kernel: RETBleed: Mitigation: untrained return thunk
Dec 11 00:08:47 np0005554845 kernel: Spectre V2 : mitigation: Enabling conditional Indirect Branch Prediction Barrier
Dec 11 00:08:47 np0005554845 kernel: Speculative Store Bypass: Mitigation: Speculative Store Bypass disabled via prctl
Dec 11 00:08:47 np0005554845 kernel: Speculative Return Stack Overflow: IBPB-extending microcode not applied!
Dec 11 00:08:47 np0005554845 kernel: Speculative Return Stack Overflow: WARNING: See https://kernel.org/doc/html/latest/admin-guide/hw-vuln/srso.html for mitigation options.
Dec 11 00:08:47 np0005554845 kernel: x86/bugs: return thunk changed
Dec 11 00:08:47 np0005554845 kernel: Speculative Return Stack Overflow: Vulnerable: Safe RET, no microcode
Dec 11 00:08:47 np0005554845 kernel: x86/fpu: Supporting XSAVE feature 0x001: 'x87 floating point registers'
Dec 11 00:08:47 np0005554845 kernel: x86/fpu: Supporting XSAVE feature 0x002: 'SSE registers'
Dec 11 00:08:47 np0005554845 kernel: x86/fpu: Supporting XSAVE feature 0x004: 'AVX registers'
Dec 11 00:08:47 np0005554845 kernel: x86/fpu: xstate_offset[2]:  576, xstate_sizes[2]:  256
Dec 11 00:08:47 np0005554845 kernel: x86/fpu: Enabled xstate features 0x7, context size is 832 bytes, using 'compacted' format.
Dec 11 00:08:47 np0005554845 kernel: Freeing SMP alternatives memory: 40K
Dec 11 00:08:47 np0005554845 kernel: pid_max: default: 32768 minimum: 301
Dec 11 00:08:47 np0005554845 kernel: LSM: initializing lsm=lockdown,capability,landlock,yama,integrity,selinux,bpf
Dec 11 00:08:47 np0005554845 kernel: landlock: Up and running.
Dec 11 00:08:47 np0005554845 kernel: Yama: becoming mindful.
Dec 11 00:08:47 np0005554845 kernel: SELinux:  Initializing.
Dec 11 00:08:47 np0005554845 kernel: LSM support for eBPF active
Dec 11 00:08:47 np0005554845 kernel: Mount-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: Mountpoint-cache hash table entries: 16384 (order: 5, 131072 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: smpboot: CPU0: AMD EPYC-Rome Processor (family: 0x17, model: 0x31, stepping: 0x0)
Dec 11 00:08:47 np0005554845 kernel: Performance Events: Fam17h+ core perfctr, AMD PMU driver.
Dec 11 00:08:47 np0005554845 kernel: ... version:                0
Dec 11 00:08:47 np0005554845 kernel: ... bit width:              48
Dec 11 00:08:47 np0005554845 kernel: ... generic registers:      6
Dec 11 00:08:47 np0005554845 kernel: ... value mask:             0000ffffffffffff
Dec 11 00:08:47 np0005554845 kernel: ... max period:             00007fffffffffff
Dec 11 00:08:47 np0005554845 kernel: ... fixed-purpose events:   0
Dec 11 00:08:47 np0005554845 kernel: ... event mask:             000000000000003f
Dec 11 00:08:47 np0005554845 kernel: signal: max sigframe size: 1776
Dec 11 00:08:47 np0005554845 kernel: rcu: Hierarchical SRCU implementation.
Dec 11 00:08:47 np0005554845 kernel: rcu: 	Max phase no-delay instances is 400.
Dec 11 00:08:47 np0005554845 kernel: smp: Bringing up secondary CPUs ...
Dec 11 00:08:47 np0005554845 kernel: smpboot: x86: Booting SMP configuration:
Dec 11 00:08:47 np0005554845 kernel: .... node  #0, CPUs:      #1 #2 #3 #4 #5 #6 #7
Dec 11 00:08:47 np0005554845 kernel: smp: Brought up 1 node, 8 CPUs
Dec 11 00:08:47 np0005554845 kernel: smpboot: Total of 8 processors activated (44799.96 BogoMIPS)
Dec 11 00:08:47 np0005554845 kernel: node 0 deferred pages initialised in 9ms
Dec 11 00:08:47 np0005554845 kernel: Memory: 7763864K/8388068K available (16384K kernel code, 5795K rwdata, 13916K rodata, 4192K init, 7164K bss, 618220K reserved, 0K cma-reserved)
Dec 11 00:08:47 np0005554845 kernel: devtmpfs: initialized
Dec 11 00:08:47 np0005554845 kernel: x86/mm: Memory block size: 128MB
Dec 11 00:08:47 np0005554845 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 11 00:08:47 np0005554845 kernel: futex hash table entries: 2048 (131072 bytes on 1 NUMA nodes, total 128 KiB, linear).
Dec 11 00:08:47 np0005554845 kernel: pinctrl core: initialized pinctrl subsystem
Dec 11 00:08:47 np0005554845 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 11 00:08:47 np0005554845 kernel: DMA: preallocated 1024 KiB GFP_KERNEL pool for atomic allocations
Dec 11 00:08:47 np0005554845 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 11 00:08:47 np0005554845 kernel: DMA: preallocated 1024 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 11 00:08:47 np0005554845 kernel: audit: initializing netlink subsys (disabled)
Dec 11 00:08:47 np0005554845 kernel: audit: type=2000 audit(1765429725.688:1): state=initialized audit_enabled=0 res=1
Dec 11 00:08:47 np0005554845 kernel: thermal_sys: Registered thermal governor 'fair_share'
Dec 11 00:08:47 np0005554845 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 11 00:08:47 np0005554845 kernel: thermal_sys: Registered thermal governor 'user_space'
Dec 11 00:08:47 np0005554845 kernel: cpuidle: using governor menu
Dec 11 00:08:47 np0005554845 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 11 00:08:47 np0005554845 kernel: PCI: Using configuration type 1 for base access
Dec 11 00:08:47 np0005554845 kernel: PCI: Using configuration type 1 for extended access
Dec 11 00:08:47 np0005554845 kernel: kprobes: kprobe jump-optimization is enabled. All kprobes are optimized if possible.
Dec 11 00:08:47 np0005554845 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 11 00:08:47 np0005554845 kernel: HugeTLB: 16380 KiB vmemmap can be freed for a 1.00 GiB page
Dec 11 00:08:47 np0005554845 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 11 00:08:47 np0005554845 kernel: HugeTLB: 28 KiB vmemmap can be freed for a 2.00 MiB page
Dec 11 00:08:47 np0005554845 kernel: Demotion targets for Node 0: null
Dec 11 00:08:47 np0005554845 kernel: cryptd: max_cpu_qlen set to 1000
Dec 11 00:08:47 np0005554845 kernel: ACPI: Added _OSI(Module Device)
Dec 11 00:08:47 np0005554845 kernel: ACPI: Added _OSI(Processor Device)
Dec 11 00:08:47 np0005554845 kernel: ACPI: Added _OSI(3.0 _SCP Extensions)
Dec 11 00:08:47 np0005554845 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 11 00:08:47 np0005554845 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 11 00:08:47 np0005554845 kernel: ACPI: _OSC evaluation for CPUs failed, trying _PDC
Dec 11 00:08:47 np0005554845 kernel: ACPI: Interpreter enabled
Dec 11 00:08:47 np0005554845 kernel: ACPI: PM: (supports S0 S3 S4 S5)
Dec 11 00:08:47 np0005554845 kernel: ACPI: Using IOAPIC for interrupt routing
Dec 11 00:08:47 np0005554845 kernel: PCI: Using host bridge windows from ACPI; if necessary, use "pci=nocrs" and report a bug
Dec 11 00:08:47 np0005554845 kernel: PCI: Using E820 reservations for host bridge windows
Dec 11 00:08:47 np0005554845 kernel: ACPI: Enabled 2 GPEs in block 00 to 0F
Dec 11 00:08:47 np0005554845 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 11 00:08:47 np0005554845 kernel: acpi PNP0A03:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI EDR HPX-Type3]
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [3] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [4] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [5] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [6] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [7] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [8] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [9] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [10] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [11] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [12] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [13] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [14] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [15] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [16] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [17] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [18] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [19] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [20] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [21] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [22] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [23] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [24] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [25] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [26] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [27] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [28] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [29] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [30] registered
Dec 11 00:08:47 np0005554845 kernel: acpiphp: Slot [31] registered
Dec 11 00:08:47 np0005554845 kernel: PCI host bridge to bus 0000:00
Dec 11 00:08:47 np0005554845 kernel: pci_bus 0000:00: root bus resource [io  0x0000-0x0cf7 window]
Dec 11 00:08:47 np0005554845 kernel: pci_bus 0000:00: root bus resource [io  0x0d00-0xffff window]
Dec 11 00:08:47 np0005554845 kernel: pci_bus 0000:00: root bus resource [mem 0x000a0000-0x000bffff window]
Dec 11 00:08:47 np0005554845 kernel: pci_bus 0000:00: root bus resource [mem 0xc0000000-0xfebfffff window]
Dec 11 00:08:47 np0005554845 kernel: pci_bus 0000:00: root bus resource [mem 0x240000000-0x2bfffffff window]
Dec 11 00:08:47 np0005554845 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:00.0: [8086:1237] type 00 class 0x060000 conventional PCI endpoint
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.0: [8086:7000] type 00 class 0x060100 conventional PCI endpoint
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.1: [8086:7010] type 00 class 0x010180 conventional PCI endpoint
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.1: BAR 4 [io  0xc140-0xc14f]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.1: BAR 0 [io  0x01f0-0x01f7]: legacy IDE quirk
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.1: BAR 1 [io  0x03f6]: legacy IDE quirk
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.1: BAR 2 [io  0x0170-0x0177]: legacy IDE quirk
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.1: BAR 3 [io  0x0376]: legacy IDE quirk
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.2: [8086:7020] type 00 class 0x0c0300 conventional PCI endpoint
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.2: BAR 4 [io  0xc100-0xc11f]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.3: [8086:7113] type 00 class 0x068000 conventional PCI endpoint
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.3: quirk: [io  0x0600-0x063f] claimed by PIIX4 ACPI
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.3: quirk: [io  0x0700-0x070f] claimed by PIIX4 SMB
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:02.0: [1af4:1050] type 00 class 0x030000 conventional PCI endpoint
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:02.0: BAR 0 [mem 0xfe000000-0xfe7fffff pref]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:02.0: BAR 2 [mem 0xfe800000-0xfe803fff 64bit pref]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:02.0: BAR 4 [mem 0xfeb90000-0xfeb90fff]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:02.0: ROM [mem 0xfeb80000-0xfeb8ffff pref]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:02.0: Video device with shadowed ROM at [mem 0x000c0000-0x000dffff]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:03.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:03.0: BAR 0 [io  0xc080-0xc0bf]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:03.0: BAR 1 [mem 0xfeb91000-0xfeb91fff]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:03.0: BAR 4 [mem 0xfe804000-0xfe807fff 64bit pref]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:03.0: ROM [mem 0xfeb00000-0xfeb7ffff pref]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:04.0: [1af4:1001] type 00 class 0x010000 conventional PCI endpoint
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:04.0: BAR 0 [io  0xc000-0xc07f]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:04.0: BAR 1 [mem 0xfeb92000-0xfeb92fff]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:04.0: BAR 4 [mem 0xfe808000-0xfe80bfff 64bit pref]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:05.0: [1af4:1002] type 00 class 0x00ff00 conventional PCI endpoint
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:05.0: BAR 0 [io  0xc0c0-0xc0ff]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:05.0: BAR 4 [mem 0xfe80c000-0xfe80ffff 64bit pref]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:06.0: [1af4:1005] type 00 class 0x00ff00 conventional PCI endpoint
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:06.0: BAR 0 [io  0xc120-0xc13f]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:06.0: BAR 4 [mem 0xfe810000-0xfe813fff 64bit pref]
Dec 11 00:08:47 np0005554845 kernel: ACPI: PCI: Interrupt link LNKA configured for IRQ 10
Dec 11 00:08:47 np0005554845 kernel: ACPI: PCI: Interrupt link LNKB configured for IRQ 10
Dec 11 00:08:47 np0005554845 kernel: ACPI: PCI: Interrupt link LNKC configured for IRQ 11
Dec 11 00:08:47 np0005554845 kernel: ACPI: PCI: Interrupt link LNKD configured for IRQ 11
Dec 11 00:08:47 np0005554845 kernel: ACPI: PCI: Interrupt link LNKS configured for IRQ 9
Dec 11 00:08:47 np0005554845 kernel: iommu: Default domain type: Translated
Dec 11 00:08:47 np0005554845 kernel: iommu: DMA domain TLB invalidation policy: lazy mode
Dec 11 00:08:47 np0005554845 kernel: SCSI subsystem initialized
Dec 11 00:08:47 np0005554845 kernel: ACPI: bus type USB registered
Dec 11 00:08:47 np0005554845 kernel: usbcore: registered new interface driver usbfs
Dec 11 00:08:47 np0005554845 kernel: usbcore: registered new interface driver hub
Dec 11 00:08:47 np0005554845 kernel: usbcore: registered new device driver usb
Dec 11 00:08:47 np0005554845 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 11 00:08:47 np0005554845 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti <giometti@linux.it>
Dec 11 00:08:47 np0005554845 kernel: PTP clock support registered
Dec 11 00:08:47 np0005554845 kernel: EDAC MC: Ver: 3.0.0
Dec 11 00:08:47 np0005554845 kernel: NetLabel: Initializing
Dec 11 00:08:47 np0005554845 kernel: NetLabel:  domain hash size = 128
Dec 11 00:08:47 np0005554845 kernel: NetLabel:  protocols = UNLABELED CIPSOv4 CALIPSO
Dec 11 00:08:47 np0005554845 kernel: NetLabel:  unlabeled traffic allowed by default
Dec 11 00:08:47 np0005554845 kernel: PCI: Using ACPI for IRQ routing
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:02.0: vgaarb: setting as boot VGA device
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:02.0: vgaarb: bridge control possible
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:02.0: vgaarb: VGA device added: decodes=io+mem,owns=io+mem,locks=none
Dec 11 00:08:47 np0005554845 kernel: vgaarb: loaded
Dec 11 00:08:47 np0005554845 kernel: clocksource: Switched to clocksource kvm-clock
Dec 11 00:08:47 np0005554845 kernel: VFS: Disk quotas dquot_6.6.0
Dec 11 00:08:47 np0005554845 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes)
Dec 11 00:08:47 np0005554845 kernel: pnp: PnP ACPI init
Dec 11 00:08:47 np0005554845 kernel: pnp: PnP ACPI: found 5 devices
Dec 11 00:08:47 np0005554845 kernel: clocksource: acpi_pm: mask: 0xffffff max_cycles: 0xffffff, max_idle_ns: 2085701024 ns
Dec 11 00:08:47 np0005554845 kernel: NET: Registered PF_INET protocol family
Dec 11 00:08:47 np0005554845 kernel: IP idents hash table entries: 131072 (order: 8, 1048576 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: tcp_listen_portaddr_hash hash table entries: 4096 (order: 4, 65536 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: TCP established hash table entries: 65536 (order: 7, 524288 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: TCP bind hash table entries: 65536 (order: 8, 1048576 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: TCP: Hash tables configured (established 65536 bind 65536)
Dec 11 00:08:47 np0005554845 kernel: MPTCP token hash table entries: 8192 (order: 5, 196608 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: UDP hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: UDP-Lite hash table entries: 4096 (order: 5, 131072 bytes, linear)
Dec 11 00:08:47 np0005554845 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family
Dec 11 00:08:47 np0005554845 kernel: NET: Registered PF_XDP protocol family
Dec 11 00:08:47 np0005554845 kernel: pci_bus 0000:00: resource 4 [io  0x0000-0x0cf7 window]
Dec 11 00:08:47 np0005554845 kernel: pci_bus 0000:00: resource 5 [io  0x0d00-0xffff window]
Dec 11 00:08:47 np0005554845 kernel: pci_bus 0000:00: resource 6 [mem 0x000a0000-0x000bffff window]
Dec 11 00:08:47 np0005554845 kernel: pci_bus 0000:00: resource 7 [mem 0xc0000000-0xfebfffff window]
Dec 11 00:08:47 np0005554845 kernel: pci_bus 0000:00: resource 8 [mem 0x240000000-0x2bfffffff window]
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.0: PIIX3: Enabling Passive Release
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:00.0: Limiting direct PCI/PCI transfers
Dec 11 00:08:47 np0005554845 kernel: ACPI: \_SB_.LNKD: Enabled at IRQ 11
Dec 11 00:08:47 np0005554845 kernel: pci 0000:00:01.2: quirk_usb_early_handoff+0x0/0x160 took 88320 usecs
Dec 11 00:08:47 np0005554845 kernel: PCI: CLS 0 bytes, default 64
Dec 11 00:08:47 np0005554845 kernel: PCI-DMA: Using software bounce buffering for IO (SWIOTLB)
Dec 11 00:08:47 np0005554845 kernel: software IO TLB: mapped [mem 0x00000000ab000000-0x00000000af000000] (64MB)
Dec 11 00:08:47 np0005554845 kernel: ACPI: bus type thunderbolt registered
Dec 11 00:08:47 np0005554845 kernel: Trying to unpack rootfs image as initramfs...
Dec 11 00:08:47 np0005554845 kernel: Initialise system trusted keyrings
Dec 11 00:08:47 np0005554845 kernel: Key type blacklist registered
Dec 11 00:08:47 np0005554845 kernel: workingset: timestamp_bits=36 max_order=21 bucket_order=0
Dec 11 00:08:47 np0005554845 kernel: zbud: loaded
Dec 11 00:08:47 np0005554845 kernel: integrity: Platform Keyring initialized
Dec 11 00:08:47 np0005554845 kernel: integrity: Machine keyring initialized
Dec 11 00:08:47 np0005554845 kernel: Freeing initrd memory: 87820K
Dec 11 00:08:47 np0005554845 kernel: NET: Registered PF_ALG protocol family
Dec 11 00:08:47 np0005554845 kernel: xor: automatically using best checksumming function   avx       
Dec 11 00:08:47 np0005554845 kernel: Key type asymmetric registered
Dec 11 00:08:47 np0005554845 kernel: Asymmetric key parser 'x509' registered
Dec 11 00:08:47 np0005554845 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 246)
Dec 11 00:08:47 np0005554845 kernel: io scheduler mq-deadline registered
Dec 11 00:08:47 np0005554845 kernel: io scheduler kyber registered
Dec 11 00:08:47 np0005554845 kernel: io scheduler bfq registered
Dec 11 00:08:47 np0005554845 kernel: atomic64_test: passed for x86-64 platform with CX8 and with SSE
Dec 11 00:08:47 np0005554845 kernel: shpchp: Standard Hot Plug PCI Controller Driver version: 0.4
Dec 11 00:08:47 np0005554845 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0
Dec 11 00:08:47 np0005554845 kernel: ACPI: button: Power Button [PWRF]
Dec 11 00:08:47 np0005554845 kernel: ACPI: \_SB_.LNKB: Enabled at IRQ 10
Dec 11 00:08:47 np0005554845 kernel: ACPI: \_SB_.LNKC: Enabled at IRQ 11
Dec 11 00:08:47 np0005554845 kernel: ACPI: \_SB_.LNKA: Enabled at IRQ 10
Dec 11 00:08:47 np0005554845 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled
Dec 11 00:08:47 np0005554845 kernel: 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A
Dec 11 00:08:47 np0005554845 kernel: Non-volatile memory driver v1.3
Dec 11 00:08:47 np0005554845 kernel: rdac: device handler registered
Dec 11 00:08:47 np0005554845 kernel: hp_sw: device handler registered
Dec 11 00:08:47 np0005554845 kernel: emc: device handler registered
Dec 11 00:08:47 np0005554845 kernel: alua: device handler registered
Dec 11 00:08:47 np0005554845 kernel: uhci_hcd 0000:00:01.2: UHCI Host Controller
Dec 11 00:08:47 np0005554845 kernel: uhci_hcd 0000:00:01.2: new USB bus registered, assigned bus number 1
Dec 11 00:08:47 np0005554845 kernel: uhci_hcd 0000:00:01.2: detected 2 ports
Dec 11 00:08:47 np0005554845 kernel: uhci_hcd 0000:00:01.2: irq 11, io port 0x0000c100
Dec 11 00:08:47 np0005554845 kernel: usb usb1: New USB device found, idVendor=1d6b, idProduct=0001, bcdDevice= 5.14
Dec 11 00:08:47 np0005554845 kernel: usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1
Dec 11 00:08:47 np0005554845 kernel: usb usb1: Product: UHCI Host Controller
Dec 11 00:08:47 np0005554845 kernel: usb usb1: Manufacturer: Linux 5.14.0-648.el9.x86_64 uhci_hcd
Dec 11 00:08:47 np0005554845 kernel: usb usb1: SerialNumber: 0000:00:01.2
Dec 11 00:08:47 np0005554845 kernel: hub 1-0:1.0: USB hub found
Dec 11 00:08:47 np0005554845 kernel: hub 1-0:1.0: 2 ports detected
Dec 11 00:08:47 np0005554845 kernel: usbcore: registered new interface driver usbserial_generic
Dec 11 00:08:47 np0005554845 kernel: usbserial: USB Serial support registered for generic
Dec 11 00:08:47 np0005554845 kernel: i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12
Dec 11 00:08:47 np0005554845 kernel: serio: i8042 KBD port at 0x60,0x64 irq 1
Dec 11 00:08:47 np0005554845 kernel: serio: i8042 AUX port at 0x60,0x64 irq 12
Dec 11 00:08:47 np0005554845 kernel: mousedev: PS/2 mouse device common for all mice
Dec 11 00:08:47 np0005554845 kernel: rtc_cmos 00:04: RTC can wake from S4
Dec 11 00:08:47 np0005554845 kernel: rtc_cmos 00:04: registered as rtc0
Dec 11 00:08:47 np0005554845 kernel: rtc_cmos 00:04: setting system clock to 2025-12-11T05:08:46 UTC (1765429726)
Dec 11 00:08:47 np0005554845 kernel: rtc_cmos 00:04: alarms up to one day, y3k, 242 bytes nvram
Dec 11 00:08:47 np0005554845 kernel: input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1
Dec 11 00:08:47 np0005554845 kernel: amd_pstate: the _CPC object is not present in SBIOS or ACPI disabled
Dec 11 00:08:47 np0005554845 kernel: hid: raw HID events driver (C) Jiri Kosina
Dec 11 00:08:47 np0005554845 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4
Dec 11 00:08:47 np0005554845 kernel: usbcore: registered new interface driver usbhid
Dec 11 00:08:47 np0005554845 kernel: usbhid: USB HID core driver
Dec 11 00:08:47 np0005554845 kernel: drop_monitor: Initializing network drop monitor service
Dec 11 00:08:47 np0005554845 kernel: input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3
Dec 11 00:08:47 np0005554845 kernel: Initializing XFRM netlink socket
Dec 11 00:08:47 np0005554845 kernel: NET: Registered PF_INET6 protocol family
Dec 11 00:08:47 np0005554845 kernel: Segment Routing with IPv6
Dec 11 00:08:47 np0005554845 kernel: NET: Registered PF_PACKET protocol family
Dec 11 00:08:47 np0005554845 kernel: mpls_gso: MPLS GSO support
Dec 11 00:08:47 np0005554845 kernel: IPI shorthand broadcast: enabled
Dec 11 00:08:47 np0005554845 kernel: AVX2 version of gcm_enc/dec engaged.
Dec 11 00:08:47 np0005554845 kernel: AES CTR mode by8 optimization enabled
Dec 11 00:08:47 np0005554845 kernel: sched_clock: Marking stable (1244003202, 150589777)->(1470872610, -76279631)
Dec 11 00:08:47 np0005554845 kernel: registered taskstats version 1
Dec 11 00:08:47 np0005554845 kernel: Loading compiled-in X.509 certificates
Dec 11 00:08:47 np0005554845 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bcc7fcdcfd9be61e8634554e9f7a1c01f32489d8'
Dec 11 00:08:47 np0005554845 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux Driver Update Program (key 3): bf57f3e87362bc7229d9f465321773dfd1f77a80'
Dec 11 00:08:47 np0005554845 kernel: Loaded X.509 cert 'Red Hat Enterprise Linux kpatch signing key: 4d38fd864ebe18c5f0b72e3852e2014c3a676fc8'
Dec 11 00:08:47 np0005554845 kernel: Loaded X.509 cert 'RH-IMA-CA: Red Hat IMA CA: fb31825dd0e073685b264e3038963673f753959a'
Dec 11 00:08:47 np0005554845 kernel: Loaded X.509 cert 'Nvidia GPU OOT signing 001: 55e1cef88193e60419f0b0ec379c49f77545acf0'
Dec 11 00:08:47 np0005554845 kernel: Demotion targets for Node 0: null
Dec 11 00:08:47 np0005554845 kernel: page_owner is disabled
Dec 11 00:08:47 np0005554845 kernel: Key type .fscrypt registered
Dec 11 00:08:47 np0005554845 kernel: Key type fscrypt-provisioning registered
Dec 11 00:08:47 np0005554845 kernel: Key type big_key registered
Dec 11 00:08:47 np0005554845 kernel: Key type encrypted registered
Dec 11 00:08:47 np0005554845 kernel: ima: No TPM chip found, activating TPM-bypass!
Dec 11 00:08:47 np0005554845 kernel: Loading compiled-in module X.509 certificates
Dec 11 00:08:47 np0005554845 kernel: Loaded X.509 cert 'The CentOS Project: CentOS Stream kernel signing key: bcc7fcdcfd9be61e8634554e9f7a1c01f32489d8'
Dec 11 00:08:47 np0005554845 kernel: ima: Allocated hash algorithm: sha256
Dec 11 00:08:47 np0005554845 kernel: ima: No architecture policies found
Dec 11 00:08:47 np0005554845 kernel: evm: Initialising EVM extended attributes:
Dec 11 00:08:47 np0005554845 kernel: evm: security.selinux
Dec 11 00:08:47 np0005554845 kernel: evm: security.SMACK64 (disabled)
Dec 11 00:08:47 np0005554845 kernel: evm: security.SMACK64EXEC (disabled)
Dec 11 00:08:47 np0005554845 kernel: evm: security.SMACK64TRANSMUTE (disabled)
Dec 11 00:08:47 np0005554845 kernel: evm: security.SMACK64MMAP (disabled)
Dec 11 00:08:47 np0005554845 kernel: evm: security.apparmor (disabled)
Dec 11 00:08:47 np0005554845 kernel: evm: security.ima
Dec 11 00:08:47 np0005554845 kernel: evm: security.capability
Dec 11 00:08:47 np0005554845 kernel: evm: HMAC attrs: 0x1
Dec 11 00:08:47 np0005554845 kernel: usb 1-1: new full-speed USB device number 2 using uhci_hcd
Dec 11 00:08:47 np0005554845 kernel: Running certificate verification RSA selftest
Dec 11 00:08:47 np0005554845 kernel: Loaded X.509 cert 'Certificate verification self-testing key: f58703bb33ce1b73ee02eccdee5b8817518fe3db'
Dec 11 00:08:47 np0005554845 kernel: Running certificate verification ECDSA selftest
Dec 11 00:08:47 np0005554845 kernel: Loaded X.509 cert 'Certificate verification ECDSA self-testing key: 2900bcea1deb7bc8479a84a23d758efdfdd2b2d3'
Dec 11 00:08:47 np0005554845 kernel: clk: Disabling unused clocks
Dec 11 00:08:47 np0005554845 kernel: Freeing unused decrypted memory: 2028K
Dec 11 00:08:47 np0005554845 kernel: Freeing unused kernel image (initmem) memory: 4192K
Dec 11 00:08:47 np0005554845 kernel: Write protecting the kernel read-only data: 30720k
Dec 11 00:08:47 np0005554845 kernel: Freeing unused kernel image (rodata/data gap) memory: 420K
Dec 11 00:08:47 np0005554845 kernel: x86/mm: Checked W+X mappings: passed, no W+X pages found.
Dec 11 00:08:47 np0005554845 kernel: Run /init as init process
Dec 11 00:08:47 np0005554845 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 11 00:08:47 np0005554845 systemd: Detected virtualization kvm.
Dec 11 00:08:47 np0005554845 systemd: Detected architecture x86-64.
Dec 11 00:08:47 np0005554845 systemd: Running in initrd.
Dec 11 00:08:47 np0005554845 systemd: No hostname configured, using default hostname.
Dec 11 00:08:47 np0005554845 systemd: Hostname set to <localhost>.
Dec 11 00:08:47 np0005554845 systemd: Initializing machine ID from VM UUID.
Dec 11 00:08:47 np0005554845 kernel: usb 1-1: New USB device found, idVendor=0627, idProduct=0001, bcdDevice= 0.00
Dec 11 00:08:47 np0005554845 kernel: usb 1-1: New USB device strings: Mfr=1, Product=3, SerialNumber=10
Dec 11 00:08:47 np0005554845 kernel: usb 1-1: Product: QEMU USB Tablet
Dec 11 00:08:47 np0005554845 kernel: usb 1-1: Manufacturer: QEMU
Dec 11 00:08:47 np0005554845 kernel: usb 1-1: SerialNumber: 28754-0000:00:01.2-1
Dec 11 00:08:47 np0005554845 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.2/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input5
Dec 11 00:08:47 np0005554845 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:00:01.2-1/input0
Dec 11 00:08:47 np0005554845 systemd: Queued start job for default target Initrd Default Target.
Dec 11 00:08:47 np0005554845 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec 11 00:08:47 np0005554845 systemd: Reached target Local Encrypted Volumes.
Dec 11 00:08:47 np0005554845 systemd: Reached target Initrd /usr File System.
Dec 11 00:08:47 np0005554845 systemd: Reached target Local File Systems.
Dec 11 00:08:47 np0005554845 systemd: Reached target Path Units.
Dec 11 00:08:47 np0005554845 systemd: Reached target Slice Units.
Dec 11 00:08:47 np0005554845 systemd: Reached target Swaps.
Dec 11 00:08:47 np0005554845 systemd: Reached target Timer Units.
Dec 11 00:08:47 np0005554845 systemd: Listening on D-Bus System Message Bus Socket.
Dec 11 00:08:47 np0005554845 systemd: Listening on Journal Socket (/dev/log).
Dec 11 00:08:47 np0005554845 systemd: Listening on Journal Socket.
Dec 11 00:08:47 np0005554845 systemd: Listening on udev Control Socket.
Dec 11 00:08:47 np0005554845 systemd: Listening on udev Kernel Socket.
Dec 11 00:08:47 np0005554845 systemd: Reached target Socket Units.
Dec 11 00:08:47 np0005554845 systemd: Starting Create List of Static Device Nodes...
Dec 11 00:08:47 np0005554845 systemd: Starting Journal Service...
Dec 11 00:08:47 np0005554845 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 11 00:08:47 np0005554845 systemd: Starting Apply Kernel Variables...
Dec 11 00:08:47 np0005554845 systemd: Starting Create System Users...
Dec 11 00:08:47 np0005554845 systemd: Starting Setup Virtual Console...
Dec 11 00:08:47 np0005554845 systemd: Finished Create List of Static Device Nodes.
Dec 11 00:08:47 np0005554845 systemd: Finished Apply Kernel Variables.
Dec 11 00:08:47 np0005554845 systemd: Finished Create System Users.
Dec 11 00:08:47 np0005554845 systemd-journald[305]: Journal started
Dec 11 00:08:47 np0005554845 systemd-journald[305]: Runtime Journal (/run/log/journal/a858e7f70b5046e3b37783982336f3bb) is 8.0M, max 153.6M, 145.6M free.
Dec 11 00:08:47 np0005554845 systemd-sysusers[309]: Creating group 'users' with GID 100.
Dec 11 00:08:47 np0005554845 systemd-sysusers[309]: Creating group 'dbus' with GID 81.
Dec 11 00:08:47 np0005554845 systemd-sysusers[309]: Creating user 'dbus' (System Message Bus) with UID 81 and GID 81.
Dec 11 00:08:47 np0005554845 systemd: Started Journal Service.
Dec 11 00:08:47 np0005554845 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 11 00:08:47 np0005554845 systemd[1]: Starting Create Volatile Files and Directories...
Dec 11 00:08:47 np0005554845 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 11 00:08:47 np0005554845 systemd[1]: Finished Create Volatile Files and Directories.
Dec 11 00:08:47 np0005554845 systemd[1]: Finished Setup Virtual Console.
Dec 11 00:08:47 np0005554845 systemd[1]: dracut ask for additional cmdline parameters was skipped because no trigger condition checks were met.
Dec 11 00:08:47 np0005554845 systemd[1]: Starting dracut cmdline hook...
Dec 11 00:08:47 np0005554845 dracut-cmdline[323]: dracut-9 dracut-057-102.git20250818.el9
Dec 11 00:08:47 np0005554845 dracut-cmdline[323]: Using kernel command line parameters:    BOOT_IMAGE=(hd0,msdos1)/boot/vmlinuz-5.14.0-648.el9.x86_64 root=UUID=cbdedf45-ed1d-4952-82a8-33a12c0ba266 ro console=ttyS0,115200n8 no_timer_check net.ifnames=0 crashkernel=1G-2G:192M,2G-64G:256M,64G-:512M
Dec 11 00:08:47 np0005554845 systemd[1]: Finished dracut cmdline hook.
Dec 11 00:08:47 np0005554845 systemd[1]: Starting dracut pre-udev hook...
Dec 11 00:08:47 np0005554845 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 11 00:08:47 np0005554845 kernel: device-mapper: uevent: version 1.0.3
Dec 11 00:08:47 np0005554845 kernel: device-mapper: ioctl: 4.50.0-ioctl (2025-04-28) initialised: dm-devel@lists.linux.dev
Dec 11 00:08:47 np0005554845 kernel: RPC: Registered named UNIX socket transport module.
Dec 11 00:08:47 np0005554845 kernel: RPC: Registered udp transport module.
Dec 11 00:08:47 np0005554845 kernel: RPC: Registered tcp transport module.
Dec 11 00:08:47 np0005554845 kernel: RPC: Registered tcp-with-tls transport module.
Dec 11 00:08:47 np0005554845 kernel: RPC: Registered tcp NFSv4.1 backchannel transport module.
Dec 11 00:08:47 np0005554845 rpc.statd[441]: Version 2.5.4 starting
Dec 11 00:08:47 np0005554845 rpc.statd[441]: Initializing NSM state
Dec 11 00:08:47 np0005554845 rpc.idmapd[446]: Setting log level to 0
Dec 11 00:08:47 np0005554845 systemd[1]: Finished dracut pre-udev hook.
Dec 11 00:08:47 np0005554845 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 11 00:08:48 np0005554845 systemd-udevd[459]: Using default interface naming scheme 'rhel-9.0'.
Dec 11 00:08:48 np0005554845 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 11 00:08:48 np0005554845 systemd[1]: Starting dracut pre-trigger hook...
Dec 11 00:08:48 np0005554845 systemd[1]: Finished dracut pre-trigger hook.
Dec 11 00:08:48 np0005554845 systemd[1]: Starting Coldplug All udev Devices...
Dec 11 00:08:48 np0005554845 systemd[1]: Created slice Slice /system/modprobe.
Dec 11 00:08:48 np0005554845 systemd[1]: Starting Load Kernel Module configfs...
Dec 11 00:08:48 np0005554845 systemd[1]: Finished Coldplug All udev Devices.
Dec 11 00:08:48 np0005554845 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 11 00:08:48 np0005554845 systemd[1]: Finished Load Kernel Module configfs.
Dec 11 00:08:48 np0005554845 systemd[1]: Mounting Kernel Configuration File System...
Dec 11 00:08:48 np0005554845 systemd[1]: nm-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 11 00:08:48 np0005554845 systemd[1]: Reached target Network.
Dec 11 00:08:48 np0005554845 systemd[1]: nm-wait-online-initrd.service was skipped because of an unmet condition check (ConditionPathExists=/run/NetworkManager/initrd/neednet).
Dec 11 00:08:48 np0005554845 systemd[1]: Starting dracut initqueue hook...
Dec 11 00:08:48 np0005554845 systemd[1]: Mounted Kernel Configuration File System.
Dec 11 00:08:48 np0005554845 systemd[1]: Reached target System Initialization.
Dec 11 00:08:48 np0005554845 systemd[1]: Reached target Basic System.
Dec 11 00:08:48 np0005554845 kernel: virtio_blk virtio2: 8/0/0 default/read/poll queues
Dec 11 00:08:48 np0005554845 kernel: virtio_blk virtio2: [vda] 167772160 512-byte logical blocks (85.9 GB/80.0 GiB)
Dec 11 00:08:48 np0005554845 kernel: vda: vda1
Dec 11 00:08:48 np0005554845 systemd-udevd[475]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 00:08:48 np0005554845 kernel: scsi host0: ata_piix
Dec 11 00:08:48 np0005554845 kernel: scsi host1: ata_piix
Dec 11 00:08:48 np0005554845 kernel: ata1: PATA max MWDMA2 cmd 0x1f0 ctl 0x3f6 bmdma 0xc140 irq 14 lpm-pol 0
Dec 11 00:08:48 np0005554845 kernel: ata2: PATA max MWDMA2 cmd 0x170 ctl 0x376 bmdma 0xc148 irq 15 lpm-pol 0
Dec 11 00:08:48 np0005554845 systemd[1]: Found device /dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266.
Dec 11 00:08:48 np0005554845 systemd[1]: Reached target Initrd Root Device.
Dec 11 00:08:48 np0005554845 kernel: ata1: found unknown device (class 0)
Dec 11 00:08:48 np0005554845 kernel: ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100
Dec 11 00:08:48 np0005554845 kernel: scsi 0:0:0:0: CD-ROM            QEMU     QEMU DVD-ROM     2.5+ PQ: 0 ANSI: 5
Dec 11 00:08:48 np0005554845 kernel: scsi 0:0:0:0: Attached scsi generic sg0 type 5
Dec 11 00:08:48 np0005554845 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray
Dec 11 00:08:48 np0005554845 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20
Dec 11 00:08:48 np0005554845 systemd[1]: Finished dracut initqueue hook.
Dec 11 00:08:48 np0005554845 systemd[1]: Reached target Preparation for Remote File Systems.
Dec 11 00:08:48 np0005554845 systemd[1]: Reached target Remote Encrypted Volumes.
Dec 11 00:08:48 np0005554845 systemd[1]: Reached target Remote File Systems.
Dec 11 00:08:48 np0005554845 systemd[1]: Starting dracut pre-mount hook...
Dec 11 00:08:48 np0005554845 systemd[1]: Finished dracut pre-mount hook.
Dec 11 00:08:48 np0005554845 systemd[1]: Starting File System Check on /dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266...
Dec 11 00:08:48 np0005554845 systemd-fsck[553]: /usr/sbin/fsck.xfs: XFS file system.
Dec 11 00:08:48 np0005554845 systemd[1]: Finished File System Check on /dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266.
Dec 11 00:08:48 np0005554845 systemd[1]: Mounting /sysroot...
Dec 11 00:08:49 np0005554845 kernel: SGI XFS with ACLs, security attributes, scrub, quota, no debug enabled
Dec 11 00:08:49 np0005554845 kernel: XFS (vda1): Mounting V5 Filesystem cbdedf45-ed1d-4952-82a8-33a12c0ba266
Dec 11 00:08:49 np0005554845 kernel: XFS (vda1): Ending clean mount
Dec 11 00:08:49 np0005554845 systemd[1]: Mounted /sysroot.
Dec 11 00:08:49 np0005554845 systemd[1]: Reached target Initrd Root File System.
Dec 11 00:08:49 np0005554845 systemd[1]: Starting Mountpoints Configured in the Real Root...
Dec 11 00:08:49 np0005554845 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 11 00:08:49 np0005554845 systemd[1]: Finished Mountpoints Configured in the Real Root.
Dec 11 00:08:49 np0005554845 systemd[1]: Reached target Initrd File Systems.
Dec 11 00:08:49 np0005554845 systemd[1]: Reached target Initrd Default Target.
Dec 11 00:08:49 np0005554845 systemd[1]: Starting dracut mount hook...
Dec 11 00:08:49 np0005554845 systemd[1]: Finished dracut mount hook.
Dec 11 00:08:49 np0005554845 systemd[1]: Starting dracut pre-pivot and cleanup hook...
Dec 11 00:08:50 np0005554845 rpc.idmapd[446]: exiting on signal 15
Dec 11 00:08:50 np0005554845 systemd[1]: var-lib-nfs-rpc_pipefs.mount: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Finished dracut pre-pivot and cleanup hook.
Dec 11 00:08:50 np0005554845 systemd[1]: Starting Cleaning Up and Shutting Down Daemons...
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Network.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Remote Encrypted Volumes.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Timer Units.
Dec 11 00:08:50 np0005554845 systemd[1]: dbus.socket: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Closed D-Bus System Message Bus Socket.
Dec 11 00:08:50 np0005554845 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped dracut pre-pivot and cleanup hook.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Initrd Default Target.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Basic System.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Initrd Root Device.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Initrd /usr File System.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Path Units.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Remote File Systems.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Preparation for Remote File Systems.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Slice Units.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Socket Units.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target System Initialization.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Local File Systems.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Swaps.
Dec 11 00:08:50 np0005554845 systemd[1]: dracut-mount.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped dracut mount hook.
Dec 11 00:08:50 np0005554845 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped dracut pre-mount hook.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped target Local Encrypted Volumes.
Dec 11 00:08:50 np0005554845 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped Dispatch Password Requests to Console Directory Watch.
Dec 11 00:08:50 np0005554845 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped dracut initqueue hook.
Dec 11 00:08:50 np0005554845 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped Apply Kernel Variables.
Dec 11 00:08:50 np0005554845 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped Create Volatile Files and Directories.
Dec 11 00:08:50 np0005554845 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped Coldplug All udev Devices.
Dec 11 00:08:50 np0005554845 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped dracut pre-trigger hook.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopping Rule-based Manager for Device Events and Files...
Dec 11 00:08:50 np0005554845 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped Setup Virtual Console.
Dec 11 00:08:50 np0005554845 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup.service.mount: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Finished Cleaning Up and Shutting Down Daemons.
Dec 11 00:08:50 np0005554845 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped Rule-based Manager for Device Events and Files.
Dec 11 00:08:50 np0005554845 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Closed udev Control Socket.
Dec 11 00:08:50 np0005554845 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Closed udev Kernel Socket.
Dec 11 00:08:50 np0005554845 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped dracut pre-udev hook.
Dec 11 00:08:50 np0005554845 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped dracut cmdline hook.
Dec 11 00:08:50 np0005554845 systemd[1]: Starting Cleanup udev Database...
Dec 11 00:08:50 np0005554845 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped Create Static Device Nodes in /dev.
Dec 11 00:08:50 np0005554845 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped Create List of Static Device Nodes.
Dec 11 00:08:50 np0005554845 systemd[1]: systemd-sysusers.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Stopped Create System Users.
Dec 11 00:08:50 np0005554845 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dsetup\x2ddev.service.mount: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: run-credentials-systemd\x2dsysusers.service.mount: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 11 00:08:50 np0005554845 systemd[1]: Finished Cleanup udev Database.
Dec 11 00:08:50 np0005554845 systemd[1]: Reached target Switch Root.
Dec 11 00:08:50 np0005554845 systemd[1]: Starting Switch Root...
Dec 11 00:08:50 np0005554845 systemd[1]: Switching root.
Dec 11 00:08:50 np0005554845 systemd-journald[305]: Journal stopped
Dec 11 00:08:51 np0005554845 systemd-journald: Received SIGTERM from PID 1 (systemd).
Dec 11 00:08:51 np0005554845 kernel: audit: type=1404 audit(1765429730.382:2): enforcing=1 old_enforcing=0 auid=4294967295 ses=4294967295 enabled=1 old-enabled=1 lsm=selinux res=1
Dec 11 00:08:51 np0005554845 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 00:08:51 np0005554845 kernel: SELinux:  policy capability open_perms=1
Dec 11 00:08:51 np0005554845 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 00:08:51 np0005554845 kernel: SELinux:  policy capability always_check_network=0
Dec 11 00:08:51 np0005554845 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 00:08:51 np0005554845 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 00:08:51 np0005554845 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 00:08:51 np0005554845 kernel: audit: type=1403 audit(1765429730.502:3): auid=4294967295 ses=4294967295 lsm=selinux res=1
Dec 11 00:08:51 np0005554845 systemd: Successfully loaded SELinux policy in 121.528ms.
Dec 11 00:08:51 np0005554845 systemd: Relabelled /dev, /dev/shm, /run, /sys/fs/cgroup in 24.822ms.
Dec 11 00:08:51 np0005554845 systemd: systemd 252-59.el9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +SMACK +SECCOMP +GCRYPT +GNUTLS +OPENSSL +ACL +BLKID +CURL +ELFUTILS +FIDO2 +IDN2 -IDN -IPTC +KMOD +LIBCRYPTSETUP +LIBFDISK +PCRE2 -PWQUALITY +P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK +XKBCOMMON +UTMP +SYSVINIT default-hierarchy=unified)
Dec 11 00:08:51 np0005554845 systemd: Detected virtualization kvm.
Dec 11 00:08:51 np0005554845 systemd: Detected architecture x86-64.
Dec 11 00:08:51 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:08:51 np0005554845 systemd: initrd-switch-root.service: Deactivated successfully.
Dec 11 00:08:51 np0005554845 systemd: Stopped Switch Root.
Dec 11 00:08:51 np0005554845 systemd: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 11 00:08:51 np0005554845 systemd: Created slice Slice /system/getty.
Dec 11 00:08:51 np0005554845 systemd: Created slice Slice /system/serial-getty.
Dec 11 00:08:51 np0005554845 systemd: Created slice Slice /system/sshd-keygen.
Dec 11 00:08:51 np0005554845 systemd: Created slice User and Session Slice.
Dec 11 00:08:51 np0005554845 systemd: Started Dispatch Password Requests to Console Directory Watch.
Dec 11 00:08:51 np0005554845 systemd: Started Forward Password Requests to Wall Directory Watch.
Dec 11 00:08:51 np0005554845 systemd: Set up automount Arbitrary Executable File Formats File System Automount Point.
Dec 11 00:08:51 np0005554845 systemd: Reached target Local Encrypted Volumes.
Dec 11 00:08:51 np0005554845 systemd: Stopped target Switch Root.
Dec 11 00:08:51 np0005554845 systemd: Stopped target Initrd File Systems.
Dec 11 00:08:51 np0005554845 systemd: Stopped target Initrd Root File System.
Dec 11 00:08:51 np0005554845 systemd: Reached target Local Integrity Protected Volumes.
Dec 11 00:08:51 np0005554845 systemd: Reached target Path Units.
Dec 11 00:08:51 np0005554845 systemd: Reached target rpc_pipefs.target.
Dec 11 00:08:51 np0005554845 systemd: Reached target Slice Units.
Dec 11 00:08:51 np0005554845 systemd: Reached target Swaps.
Dec 11 00:08:51 np0005554845 systemd: Reached target Local Verity Protected Volumes.
Dec 11 00:08:51 np0005554845 systemd: Listening on RPCbind Server Activation Socket.
Dec 11 00:08:51 np0005554845 systemd: Reached target RPC Port Mapper.
Dec 11 00:08:51 np0005554845 systemd: Listening on Process Core Dump Socket.
Dec 11 00:08:51 np0005554845 systemd: Listening on initctl Compatibility Named Pipe.
Dec 11 00:08:51 np0005554845 systemd: Listening on udev Control Socket.
Dec 11 00:08:51 np0005554845 systemd: Listening on udev Kernel Socket.
Dec 11 00:08:51 np0005554845 systemd: Mounting Huge Pages File System...
Dec 11 00:08:51 np0005554845 systemd: Mounting POSIX Message Queue File System...
Dec 11 00:08:51 np0005554845 systemd: Mounting Kernel Debug File System...
Dec 11 00:08:51 np0005554845 systemd: Mounting Kernel Trace File System...
Dec 11 00:08:51 np0005554845 systemd: Kernel Module supporting RPCSEC_GSS was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 11 00:08:51 np0005554845 systemd: Starting Create List of Static Device Nodes...
Dec 11 00:08:51 np0005554845 systemd: Starting Load Kernel Module configfs...
Dec 11 00:08:51 np0005554845 systemd: Starting Load Kernel Module drm...
Dec 11 00:08:51 np0005554845 systemd: Starting Load Kernel Module efi_pstore...
Dec 11 00:08:51 np0005554845 systemd: Starting Load Kernel Module fuse...
Dec 11 00:08:51 np0005554845 systemd: Starting Read and set NIS domainname from /etc/sysconfig/network...
Dec 11 00:08:51 np0005554845 systemd: systemd-fsck-root.service: Deactivated successfully.
Dec 11 00:08:51 np0005554845 systemd: Stopped File System Check on Root Device.
Dec 11 00:08:51 np0005554845 systemd: Stopped Journal Service.
Dec 11 00:08:51 np0005554845 kernel: fuse: init (API version 7.37)
Dec 11 00:08:51 np0005554845 systemd: Starting Journal Service...
Dec 11 00:08:51 np0005554845 systemd: Load Kernel Modules was skipped because no trigger condition checks were met.
Dec 11 00:08:51 np0005554845 systemd: Starting Generate network units from Kernel command line...
Dec 11 00:08:51 np0005554845 systemd: TPM2 PCR Machine ID Measurement was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 11 00:08:51 np0005554845 systemd: Starting Remount Root and Kernel File Systems...
Dec 11 00:08:51 np0005554845 systemd: Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 11 00:08:51 np0005554845 systemd: Starting Apply Kernel Variables...
Dec 11 00:08:51 np0005554845 systemd: Starting Coldplug All udev Devices...
Dec 11 00:08:51 np0005554845 kernel: xfs filesystem being remounted at / supports timestamps until 2038 (0x7fffffff)
Dec 11 00:08:51 np0005554845 systemd-journald[677]: Journal started
Dec 11 00:08:51 np0005554845 systemd-journald[677]: Runtime Journal (/run/log/journal/64f1d6692049d8be5e8b216cc203502c) is 8.0M, max 153.6M, 145.6M free.
Dec 11 00:08:51 np0005554845 systemd[1]: Queued start job for default target Multi-User System.
Dec 11 00:08:51 np0005554845 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 11 00:08:51 np0005554845 systemd: Started Journal Service.
Dec 11 00:08:51 np0005554845 systemd[1]: Mounted Huge Pages File System.
Dec 11 00:08:51 np0005554845 systemd[1]: Mounted POSIX Message Queue File System.
Dec 11 00:08:51 np0005554845 systemd[1]: Mounted Kernel Debug File System.
Dec 11 00:08:51 np0005554845 systemd[1]: Mounted Kernel Trace File System.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Create List of Static Device Nodes.
Dec 11 00:08:51 np0005554845 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Load Kernel Module configfs.
Dec 11 00:08:51 np0005554845 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Load Kernel Module efi_pstore.
Dec 11 00:08:51 np0005554845 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Load Kernel Module fuse.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Read and set NIS domainname from /etc/sysconfig/network.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Generate network units from Kernel command line.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Remount Root and Kernel File Systems.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Apply Kernel Variables.
Dec 11 00:08:51 np0005554845 systemd[1]: Mounting FUSE Control File System...
Dec 11 00:08:51 np0005554845 kernel: ACPI: bus type drm_connector registered
Dec 11 00:08:51 np0005554845 systemd[1]: First Boot Wizard was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Rebuild Hardware Database...
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Flush Journal to Persistent Storage...
Dec 11 00:08:51 np0005554845 systemd[1]: Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Load/Save OS Random Seed...
Dec 11 00:08:51 np0005554845 systemd-journald[677]: Runtime Journal (/run/log/journal/64f1d6692049d8be5e8b216cc203502c) is 8.0M, max 153.6M, 145.6M free.
Dec 11 00:08:51 np0005554845 systemd-journald[677]: Received client request to flush runtime journal.
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Create System Users...
Dec 11 00:08:51 np0005554845 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Load Kernel Module drm.
Dec 11 00:08:51 np0005554845 systemd[1]: Mounted FUSE Control File System.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Flush Journal to Persistent Storage.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Coldplug All udev Devices.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Load/Save OS Random Seed.
Dec 11 00:08:51 np0005554845 systemd[1]: First Boot Complete was skipped because of an unmet condition check (ConditionFirstBoot=yes).
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Create System Users.
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Create Static Device Nodes in /dev...
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Create Static Device Nodes in /dev.
Dec 11 00:08:51 np0005554845 systemd[1]: Reached target Preparation for Local File Systems.
Dec 11 00:08:51 np0005554845 systemd[1]: Reached target Local File Systems.
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Rebuild Dynamic Linker Cache...
Dec 11 00:08:51 np0005554845 systemd[1]: Mark the need to relabel after reboot was skipped because of an unmet condition check (ConditionSecurity=!selinux).
Dec 11 00:08:51 np0005554845 systemd[1]: Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 11 00:08:51 np0005554845 systemd[1]: Update Boot Loader Random Seed was skipped because no trigger condition checks were met.
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Automatic Boot Loader Update...
Dec 11 00:08:51 np0005554845 systemd[1]: Commit a transient machine-id on disk was skipped because of an unmet condition check (ConditionPathIsMountPoint=/etc/machine-id).
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Create Volatile Files and Directories...
Dec 11 00:08:51 np0005554845 bootctl[695]: Couldn't find EFI system partition, skipping.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Automatic Boot Loader Update.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Create Volatile Files and Directories.
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Security Auditing Service...
Dec 11 00:08:51 np0005554845 systemd[1]: Starting RPC Bind...
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Rebuild Journal Catalog...
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Rebuild Dynamic Linker Cache.
Dec 11 00:08:51 np0005554845 auditd[701]: audit dispatcher initialized with q_depth=2000 and 1 active plugins
Dec 11 00:08:51 np0005554845 auditd[701]: Init complete, auditd 3.1.5 listening for events (startup state enable)
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Rebuild Journal Catalog.
Dec 11 00:08:51 np0005554845 systemd[1]: Started RPC Bind.
Dec 11 00:08:51 np0005554845 augenrules[706]: /sbin/augenrules: No change
Dec 11 00:08:51 np0005554845 augenrules[721]: No rules
Dec 11 00:08:51 np0005554845 augenrules[721]: enabled 1
Dec 11 00:08:51 np0005554845 augenrules[721]: failure 1
Dec 11 00:08:51 np0005554845 augenrules[721]: pid 701
Dec 11 00:08:51 np0005554845 augenrules[721]: rate_limit 0
Dec 11 00:08:51 np0005554845 augenrules[721]: backlog_limit 8192
Dec 11 00:08:51 np0005554845 augenrules[721]: lost 0
Dec 11 00:08:51 np0005554845 augenrules[721]: backlog 0
Dec 11 00:08:51 np0005554845 augenrules[721]: backlog_wait_time 60000
Dec 11 00:08:51 np0005554845 augenrules[721]: backlog_wait_time_actual 0
Dec 11 00:08:51 np0005554845 systemd[1]: Started Security Auditing Service.
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Record System Boot/Shutdown in UTMP...
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Record System Boot/Shutdown in UTMP.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Rebuild Hardware Database.
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Rule-based Manager for Device Events and Files...
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Update is Completed...
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Update is Completed.
Dec 11 00:08:51 np0005554845 systemd-udevd[729]: Using default interface naming scheme 'rhel-9.0'.
Dec 11 00:08:51 np0005554845 systemd[1]: Started Rule-based Manager for Device Events and Files.
Dec 11 00:08:51 np0005554845 systemd[1]: Reached target System Initialization.
Dec 11 00:08:51 np0005554845 systemd[1]: Started dnf makecache --timer.
Dec 11 00:08:51 np0005554845 systemd[1]: Started Daily rotation of log files.
Dec 11 00:08:51 np0005554845 systemd[1]: Started Daily Cleanup of Temporary Directories.
Dec 11 00:08:51 np0005554845 systemd[1]: Reached target Timer Units.
Dec 11 00:08:51 np0005554845 systemd[1]: Listening on D-Bus System Message Bus Socket.
Dec 11 00:08:51 np0005554845 systemd[1]: Listening on SSSD Kerberos Cache Manager responder socket.
Dec 11 00:08:51 np0005554845 systemd[1]: Reached target Socket Units.
Dec 11 00:08:51 np0005554845 systemd[1]: Starting D-Bus System Message Bus...
Dec 11 00:08:51 np0005554845 systemd[1]: TPM2 PCR Barrier (Initialization) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Load Kernel Module configfs...
Dec 11 00:08:51 np0005554845 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 11 00:08:51 np0005554845 systemd[1]: Finished Load Kernel Module configfs.
Dec 11 00:08:51 np0005554845 systemd[1]: Started D-Bus System Message Bus.
Dec 11 00:08:51 np0005554845 systemd-udevd[747]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 00:08:51 np0005554845 systemd[1]: Reached target Basic System.
Dec 11 00:08:51 np0005554845 dbus-broker-lau[767]: Ready
Dec 11 00:08:51 np0005554845 systemd[1]: Starting NTP client/server...
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Cloud-init: Local Stage (pre-network)...
Dec 11 00:08:51 np0005554845 systemd[1]: Starting Restore /run/initramfs on shutdown...
Dec 11 00:08:52 np0005554845 systemd[1]: Starting IPv4 firewall with iptables...
Dec 11 00:08:52 np0005554845 kernel: input: PC Speaker as /devices/platform/pcspkr/input/input6
Dec 11 00:08:52 np0005554845 systemd[1]: Started irqbalance daemon.
Dec 11 00:08:52 np0005554845 systemd[1]: Load CPU microcode update was skipped because of an unmet condition check (ConditionPathExists=/sys/devices/system/cpu/microcode/reload).
Dec 11 00:08:52 np0005554845 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 00:08:52 np0005554845 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 00:08:52 np0005554845 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 00:08:52 np0005554845 systemd[1]: Reached target sshd-keygen.target.
Dec 11 00:08:52 np0005554845 systemd[1]: System Security Services Daemon was skipped because no trigger condition checks were met.
Dec 11 00:08:52 np0005554845 systemd[1]: Reached target User and Group Name Lookups.
Dec 11 00:08:52 np0005554845 kernel: piix4_smbus 0000:00:01.3: SMBus Host Controller at 0x700, revision 0
Dec 11 00:08:52 np0005554845 kernel: i2c i2c-0: 1/1 memory slots populated (from DMI)
Dec 11 00:08:52 np0005554845 kernel: i2c i2c-0: Memory type 0x07 not supported yet, not instantiating SPD
Dec 11 00:08:52 np0005554845 systemd[1]: Starting User Login Management...
Dec 11 00:08:52 np0005554845 systemd[1]: Finished Restore /run/initramfs on shutdown.
Dec 11 00:08:52 np0005554845 systemd[1]: Condition check resulted in /dev/ttyS0 being skipped.
Dec 11 00:08:52 np0005554845 kernel: Warning: Deprecated Driver is detected: nft_compat will not be maintained in a future major release and may be disabled
Dec 11 00:08:52 np0005554845 kernel: Warning: Deprecated Driver is detected: nft_compat_module_init will not be maintained in a future major release and may be disabled
Dec 11 00:08:52 np0005554845 systemd-logind[789]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 11 00:08:52 np0005554845 systemd-logind[789]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 11 00:08:52 np0005554845 systemd-logind[789]: New seat seat0.
Dec 11 00:08:52 np0005554845 chronyd[799]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 11 00:08:52 np0005554845 systemd[1]: Started User Login Management.
Dec 11 00:08:52 np0005554845 chronyd[799]: Loaded 0 symmetric keys
Dec 11 00:08:52 np0005554845 chronyd[799]: Using right/UTC timezone to obtain leap second data
Dec 11 00:08:52 np0005554845 chronyd[799]: Loaded seccomp filter (level 2)
Dec 11 00:08:52 np0005554845 kernel: [drm] pci: virtio-vga detected at 0000:00:02.0
Dec 11 00:08:52 np0005554845 kernel: virtio-pci 0000:00:02.0: vgaarb: deactivate vga console
Dec 11 00:08:52 np0005554845 systemd[1]: Started NTP client/server.
Dec 11 00:08:52 np0005554845 kernel: Console: switching to colour dummy device 80x25
Dec 11 00:08:52 np0005554845 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 11 00:08:52 np0005554845 kernel: [drm] features: -context_init
Dec 11 00:08:52 np0005554845 kernel: [drm] number of scanouts: 1
Dec 11 00:08:52 np0005554845 kernel: [drm] number of cap sets: 0
Dec 11 00:08:52 np0005554845 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:02.0 on minor 0
Dec 11 00:08:52 np0005554845 kernel: fbcon: virtio_gpudrmfb (fb0) is primary device
Dec 11 00:08:52 np0005554845 kernel: Console: switching to colour frame buffer device 128x48
Dec 11 00:08:52 np0005554845 kernel: virtio-pci 0000:00:02.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 11 00:08:52 np0005554845 kernel: kvm_amd: TSC scaling supported
Dec 11 00:08:52 np0005554845 kernel: kvm_amd: Nested Virtualization enabled
Dec 11 00:08:52 np0005554845 kernel: kvm_amd: Nested Paging enabled
Dec 11 00:08:52 np0005554845 kernel: kvm_amd: LBR virtualization supported
Dec 11 00:08:52 np0005554845 iptables.init[778]: iptables: Applying firewall rules: [  OK  ]
Dec 11 00:08:52 np0005554845 systemd[1]: Finished IPv4 firewall with iptables.
Dec 11 00:08:52 np0005554845 cloud-init[837]: Cloud-init v. 24.4-7.el9 running 'init-local' at Thu, 11 Dec 2025 05:08:52 +0000. Up 7.15 seconds.
Dec 11 00:08:52 np0005554845 systemd[1]: run-cloud\x2dinit-tmp-tmpa7p2q7sz.mount: Deactivated successfully.
Dec 11 00:08:52 np0005554845 systemd[1]: Starting Hostname Service...
Dec 11 00:08:52 np0005554845 systemd[1]: Started Hostname Service.
Dec 11 00:08:52 np0005554845 systemd-hostnamed[852]: Hostname set to <np0005554845.novalocal> (static)
Dec 11 00:08:53 np0005554845 systemd[1]: Finished Cloud-init: Local Stage (pre-network).
Dec 11 00:08:53 np0005554845 systemd[1]: Reached target Preparation for Network.
Dec 11 00:08:53 np0005554845 systemd[1]: Starting Network Manager...
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.1848] NetworkManager (version 1.54.2-1.el9) is starting... (boot:6fd4de4e-7b3d-47fa-92ea-8323052e6a02)
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.1855] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.1969] manager[0x557a178ee000]: monitoring kernel firmware directory '/lib/firmware'.
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2022] hostname: hostname: using hostnamed
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2023] hostname: static hostname changed from (none) to "np0005554845.novalocal"
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2030] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2202] manager[0x557a178ee000]: rfkill: Wi-Fi hardware radio set enabled
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2204] manager[0x557a178ee000]: rfkill: WWAN hardware radio set enabled
Dec 11 00:08:53 np0005554845 systemd[1]: Listening on Load/Save RF Kill Switch Status /dev/rfkill Watch.
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2277] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2278] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2279] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2280] manager: Networking is enabled by state file
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2284] settings: Loaded settings plugin: keyfile (internal)
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2301] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2338] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2355] dhcp: init: Using DHCP client 'internal'
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2362] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2388] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2400] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2413] device (lo): Activation: starting connection 'lo' (98463273-a93b-4216-b1e6-11bf6b9079be)
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2428] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2434] device (eth0): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2472] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2479] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2484] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2488] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2492] device (eth0): carrier: link connected
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2498] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 11 00:08:53 np0005554845 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2528] device (eth0): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2543] policy: auto-activating connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2553] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2557] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2561] manager: NetworkManager state is now CONNECTING
Dec 11 00:08:53 np0005554845 systemd[1]: Started Network Manager.
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2564] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2579] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2585] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 00:08:53 np0005554845 systemd[1]: Reached target Network.
Dec 11 00:08:53 np0005554845 systemd[1]: Starting Network Manager Wait Online...
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2661] dhcp4 (eth0): state changed new lease, address=38.102.83.9
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2673] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2702] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:08:53 np0005554845 systemd[1]: Starting GSSAPI Proxy Daemon...
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2750] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2752] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 11 00:08:53 np0005554845 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2758] device (lo): Activation: successful, device activated.
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2794] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2796] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2801] manager: NetworkManager state is now CONNECTED_SITE
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2804] device (eth0): Activation: successful, device activated.
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2812] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 11 00:08:53 np0005554845 NetworkManager[856]: <info>  [1765429733.2816] manager: startup complete
Dec 11 00:08:53 np0005554845 systemd[1]: Started GSSAPI Proxy Daemon.
Dec 11 00:08:53 np0005554845 systemd[1]: RPC security service for NFS client and server was skipped because of an unmet condition check (ConditionPathExists=/etc/krb5.keytab).
Dec 11 00:08:53 np0005554845 systemd[1]: Reached target NFS client services.
Dec 11 00:08:53 np0005554845 systemd[1]: Reached target Preparation for Remote File Systems.
Dec 11 00:08:53 np0005554845 systemd[1]: Reached target Remote File Systems.
Dec 11 00:08:53 np0005554845 systemd[1]: TPM2 PCR Barrier (User) was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/StubPcrKernelImage-4a67b082-0a4c-41cf-b6c7-440b29bb8c4f).
Dec 11 00:08:53 np0005554845 systemd[1]: Finished Network Manager Wait Online.
Dec 11 00:08:53 np0005554845 systemd[1]: Starting Cloud-init: Network Stage...
Dec 11 00:08:54 np0005554845 cloud-init[919]: Cloud-init v. 24.4-7.el9 running 'init' at Thu, 11 Dec 2025 05:08:54 +0000. Up 8.98 seconds.
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++++++++Net device info+++++++++++++++++++++++++++++++++++++++
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: | Device |  Up  |           Address            |      Mask     | Scope  |     Hw-Address    |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: |  eth0  | True |         38.102.83.9          | 255.255.255.0 | global | fa:16:3e:95:7a:1e |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: |  eth0  | True | fe80::f816:3eff:fe95:7a1e/64 |       .       |  link  | fa:16:3e:95:7a:1e |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: |   lo   | True |          127.0.0.1           |   255.0.0.0   |  host  |         .         |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: |   lo   | True |           ::1/128            |       .       |  host  |         .         |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +--------+------+------------------------------+---------------+--------+-------------------+
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +++++++++++++++++++++++++++++++++Route IPv4 info+++++++++++++++++++++++++++++++++
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: | Route |   Destination   |    Gateway    |     Genmask     | Interface | Flags |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: |   0   |     0.0.0.0     |  38.102.83.1  |     0.0.0.0     |    eth0   |   UG  |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: |   1   |   38.102.83.0   |    0.0.0.0    |  255.255.255.0  |    eth0   |   U   |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: |   2   | 169.254.169.254 | 38.102.83.126 | 255.255.255.255 |    eth0   |  UGH  |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +-------+-----------------+---------------+-----------------+-----------+-------+
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +++++++++++++++++++Route IPv6 info+++++++++++++++++++
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: | Route | Destination | Gateway | Interface | Flags |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: |   1   |  fe80::/64  |    ::   |    eth0   |   U   |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: |   3   |  multicast  |    ::   |    eth0   |   U   |
Dec 11 00:08:54 np0005554845 cloud-init[919]: ci-info: +-------+-------------+---------+-----------+-------+
Dec 11 00:08:55 np0005554845 cloud-init[919]: Generating public/private rsa key pair.
Dec 11 00:08:55 np0005554845 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_rsa_key
Dec 11 00:08:55 np0005554845 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_rsa_key.pub
Dec 11 00:08:55 np0005554845 cloud-init[919]: The key fingerprint is:
Dec 11 00:08:55 np0005554845 cloud-init[919]: SHA256:b0eyPPOmLk8WcK4pi5nM0Lg6HZNoZQUxbXAlksl77Zw root@np0005554845.novalocal
Dec 11 00:08:55 np0005554845 cloud-init[919]: The key's randomart image is:
Dec 11 00:08:55 np0005554845 cloud-init[919]: +---[RSA 3072]----+
Dec 11 00:08:55 np0005554845 cloud-init[919]: |  .*Bo..         |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |   +++.          |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |    + . . .      |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |   + . . +       |
Dec 11 00:08:55 np0005554845 cloud-init[919]: | .o.. o S + .    |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |..+o   E = =     |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |..oo. . o X .    |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |. .= + o.+ =.    |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |.o. * .  ++o.    |
Dec 11 00:08:55 np0005554845 cloud-init[919]: +----[SHA256]-----+
Dec 11 00:08:55 np0005554845 cloud-init[919]: Generating public/private ecdsa key pair.
Dec 11 00:08:55 np0005554845 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ecdsa_key
Dec 11 00:08:55 np0005554845 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ecdsa_key.pub
Dec 11 00:08:55 np0005554845 cloud-init[919]: The key fingerprint is:
Dec 11 00:08:55 np0005554845 cloud-init[919]: SHA256:QLtY27q8DvCpVfxsCVfbkm3DFBTNEb5FnoOHDDY7sZ0 root@np0005554845.novalocal
Dec 11 00:08:55 np0005554845 cloud-init[919]: The key's randomart image is:
Dec 11 00:08:55 np0005554845 cloud-init[919]: +---[ECDSA 256]---+
Dec 11 00:08:55 np0005554845 cloud-init[919]: |      .    *++oo.|
Dec 11 00:08:55 np0005554845 cloud-init[919]: |     . .  . Oo*o.|
Dec 11 00:08:55 np0005554845 cloud-init[919]: |      +   .+.E.+o|
Dec 11 00:08:55 np0005554845 cloud-init[919]: |     + = . B. .o.|
Dec 11 00:08:55 np0005554845 cloud-init[919]: |  . . * S + * .  |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |   o o * . o .   |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |    = . =        |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |   o o o         |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |  .  .=.         |
Dec 11 00:08:55 np0005554845 cloud-init[919]: +----[SHA256]-----+
Dec 11 00:08:55 np0005554845 cloud-init[919]: Generating public/private ed25519 key pair.
Dec 11 00:08:55 np0005554845 cloud-init[919]: Your identification has been saved in /etc/ssh/ssh_host_ed25519_key
Dec 11 00:08:55 np0005554845 cloud-init[919]: Your public key has been saved in /etc/ssh/ssh_host_ed25519_key.pub
Dec 11 00:08:55 np0005554845 cloud-init[919]: The key fingerprint is:
Dec 11 00:08:55 np0005554845 cloud-init[919]: SHA256:zZk+iC6HaDyuLGeYEPJXbVlkMDp4nNfYl69xqaIYbvA root@np0005554845.novalocal
Dec 11 00:08:55 np0005554845 cloud-init[919]: The key's randomart image is:
Dec 11 00:08:55 np0005554845 cloud-init[919]: +--[ED25519 256]--+
Dec 11 00:08:55 np0005554845 cloud-init[919]: |        ooo      |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |     o o *.  .   |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |    . *.ooo o    |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |o    ..o+o + . . |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |.o   . .S = . +  |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |. . o  . o   =   |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |.+ o +o . + o    |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |= B ooEo . o     |
Dec 11 00:08:55 np0005554845 cloud-init[919]: |oB.. ++ .        |
Dec 11 00:08:55 np0005554845 cloud-init[919]: +----[SHA256]-----+
Dec 11 00:08:56 np0005554845 systemd[1]: Finished Cloud-init: Network Stage.
Dec 11 00:08:56 np0005554845 systemd[1]: Reached target Cloud-config availability.
Dec 11 00:08:56 np0005554845 systemd[1]: Reached target Network is Online.
Dec 11 00:08:56 np0005554845 systemd[1]: Starting Cloud-init: Config Stage...
Dec 11 00:08:56 np0005554845 systemd[1]: Starting Crash recovery kernel arming...
Dec 11 00:08:56 np0005554845 systemd[1]: Starting Notify NFS peers of a restart...
Dec 11 00:08:56 np0005554845 sm-notify[1001]: Version 2.5.4 starting
Dec 11 00:08:56 np0005554845 systemd[1]: Starting System Logging Service...
Dec 11 00:08:56 np0005554845 systemd[1]: Starting OpenSSH server daemon...
Dec 11 00:08:56 np0005554845 systemd[1]: Starting Permit User Sessions...
Dec 11 00:08:56 np0005554845 systemd[1]: Started Notify NFS peers of a restart.
Dec 11 00:08:56 np0005554845 systemd[1]: Started OpenSSH server daemon.
Dec 11 00:08:56 np0005554845 systemd[1]: Finished Permit User Sessions.
Dec 11 00:08:56 np0005554845 systemd[1]: Started Command Scheduler.
Dec 11 00:08:56 np0005554845 systemd[1]: Started Getty on tty1.
Dec 11 00:08:56 np0005554845 systemd[1]: Started Serial Getty on ttyS0.
Dec 11 00:08:56 np0005554845 systemd[1]: Reached target Login Prompts.
Dec 11 00:08:56 np0005554845 rsyslogd[1002]: [origin software="rsyslogd" swVersion="8.2510.0-2.el9" x-pid="1002" x-info="https://www.rsyslog.com"] start
Dec 11 00:08:56 np0005554845 rsyslogd[1002]: imjournal: No statefile exists, /var/lib/rsyslog/imjournal.state will be created (ignore if this is first run): No such file or directory [v8.2510.0-2.el9 try https://www.rsyslog.com/e/2040 ]
Dec 11 00:08:56 np0005554845 systemd[1]: Started System Logging Service.
Dec 11 00:08:56 np0005554845 systemd[1]: Reached target Multi-User System.
Dec 11 00:08:56 np0005554845 systemd[1]: Starting Record Runlevel Change in UTMP...
Dec 11 00:08:56 np0005554845 systemd[1]: systemd-update-utmp-runlevel.service: Deactivated successfully.
Dec 11 00:08:56 np0005554845 systemd[1]: Finished Record Runlevel Change in UTMP.
Dec 11 00:08:56 np0005554845 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 00:08:56 np0005554845 kdumpctl[1013]: kdump: No kdump initial ramdisk found.
Dec 11 00:08:56 np0005554845 kdumpctl[1013]: kdump: Rebuilding /boot/initramfs-5.14.0-648.el9.x86_64kdump.img
Dec 11 00:08:56 np0005554845 cloud-init[1129]: Cloud-init v. 24.4-7.el9 running 'modules:config' at Thu, 11 Dec 2025 05:08:56 +0000. Up 11.03 seconds.
Dec 11 00:08:56 np0005554845 systemd[1]: Finished Cloud-init: Config Stage.
Dec 11 00:08:56 np0005554845 systemd[1]: Starting Cloud-init: Final Stage...
Dec 11 00:08:56 np0005554845 dracut[1280]: dracut-057-102.git20250818.el9
Dec 11 00:08:56 np0005554845 cloud-init[1298]: Cloud-init v. 24.4-7.el9 running 'modules:final' at Thu, 11 Dec 2025 05:08:56 +0000. Up 11.44 seconds.
Dec 11 00:08:56 np0005554845 cloud-init[1300]: #############################################################
Dec 11 00:08:56 np0005554845 cloud-init[1301]: -----BEGIN SSH HOST KEY FINGERPRINTS-----
Dec 11 00:08:56 np0005554845 cloud-init[1304]: 256 SHA256:QLtY27q8DvCpVfxsCVfbkm3DFBTNEb5FnoOHDDY7sZ0 root@np0005554845.novalocal (ECDSA)
Dec 11 00:08:56 np0005554845 cloud-init[1309]: 256 SHA256:zZk+iC6HaDyuLGeYEPJXbVlkMDp4nNfYl69xqaIYbvA root@np0005554845.novalocal (ED25519)
Dec 11 00:08:56 np0005554845 cloud-init[1316]: 3072 SHA256:b0eyPPOmLk8WcK4pi5nM0Lg6HZNoZQUxbXAlksl77Zw root@np0005554845.novalocal (RSA)
Dec 11 00:08:56 np0005554845 dracut[1283]: Executing: /usr/bin/dracut --quiet --hostonly --hostonly-cmdline --hostonly-i18n --hostonly-mode strict --hostonly-nics  --mount "/dev/disk/by-uuid/cbdedf45-ed1d-4952-82a8-33a12c0ba266 /sysroot xfs rw,relatime,seclabel,attr2,inode64,logbufs=8,logbsize=32k,noquota" --squash-compressor zstd --no-hostonly-default-device --add-confdir /lib/kdump/dracut.conf.d -f /boot/initramfs-5.14.0-648.el9.x86_64kdump.img 5.14.0-648.el9.x86_64
Dec 11 00:08:56 np0005554845 cloud-init[1318]: -----END SSH HOST KEY FINGERPRINTS-----
Dec 11 00:08:56 np0005554845 cloud-init[1322]: #############################################################
Dec 11 00:08:56 np0005554845 cloud-init[1298]: Cloud-init v. 24.4-7.el9 finished at Thu, 11 Dec 2025 05:08:56 +0000. Datasource DataSourceConfigDrive [net,ver=2][source=/dev/sr0].  Up 11.63 seconds
Dec 11 00:08:57 np0005554845 systemd[1]: Finished Cloud-init: Final Stage.
Dec 11 00:08:57 np0005554845 systemd[1]: Reached target Cloud-init target.
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'systemd-networkd' will not be installed, because command 'networkctl' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'systemd-networkd' will not be installed, because command '/usr/lib/systemd/systemd-networkd-wait-online' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 11 00:08:57 np0005554845 dracut[1283]: dracut module 'biosdevname' will not be installed, because command 'biosdevname' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: memstrack is not available
Dec 11 00:08:58 np0005554845 dracut[1283]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'systemd-resolved' will not be installed, because command 'resolvectl' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'systemd-resolved' will not be installed, because command '/usr/lib/systemd/systemd-resolved' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-timesyncd' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'systemd-timesyncd' will not be installed, because command '/usr/lib/systemd/systemd-time-wait-sync' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'busybox' will not be installed, because command 'busybox' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'dbus-daemon' will not be installed, because command 'dbus-daemon' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'rngd' will not be installed, because command 'rngd' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'connman' will not be installed, because command 'connmand' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'connman' will not be installed, because command 'connmanctl' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'connman' will not be installed, because command 'connmand-wait-online' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'network-wicked' will not be installed, because command 'wicked' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: 62bluetooth: Could not find any command of '/usr/lib/bluetooth/bluetoothd /usr/libexec/bluetooth/bluetoothd'!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'lvmmerge' will not be installed, because command 'lvm' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'lvmthinpool-monitor' will not be installed, because command 'lvm' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'btrfs' will not be installed, because command 'btrfs' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'dmraid' will not be installed, because command 'dmraid' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'lvm' will not be installed, because command 'lvm' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'mdraid' will not be installed, because command 'mdadm' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'pcsc' will not be installed, because command 'pcscd' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'tpm2-tss' will not be installed, because command 'tpm2' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'cifs' will not be installed, because command 'mount.cifs' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'iscsi' will not be installed, because command 'iscsi-iname' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'iscsi' will not be installed, because command 'iscsiadm' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'iscsi' will not be installed, because command 'iscsid' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'nvmf' will not be installed, because command 'nvme' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: dracut module 'memstrack' will not be installed, because command 'memstrack' could not be found!
Dec 11 00:08:58 np0005554845 dracut[1283]: memstrack is not available
Dec 11 00:08:58 np0005554845 dracut[1283]: If you need to use rd.memdebug>=4, please install memstrack and procps-ng
Dec 11 00:08:58 np0005554845 dracut[1283]: *** Including module: systemd ***
Dec 11 00:08:58 np0005554845 chronyd[799]: Selected source 138.197.164.54 (2.centos.pool.ntp.org)
Dec 11 00:08:58 np0005554845 chronyd[799]: System clock TAI offset set to 37 seconds
Dec 11 00:08:58 np0005554845 dracut[1283]: *** Including module: fips ***
Dec 11 00:08:59 np0005554845 dracut[1283]: *** Including module: systemd-initrd ***
Dec 11 00:08:59 np0005554845 dracut[1283]: *** Including module: i18n ***
Dec 11 00:08:59 np0005554845 dracut[1283]: *** Including module: drm ***
Dec 11 00:08:59 np0005554845 dracut[1283]: *** Including module: prefixdevname ***
Dec 11 00:08:59 np0005554845 dracut[1283]: *** Including module: kernel-modules ***
Dec 11 00:09:00 np0005554845 kernel: block vda: the capability attribute has been deprecated.
Dec 11 00:09:00 np0005554845 dracut[1283]: *** Including module: kernel-modules-extra ***
Dec 11 00:09:00 np0005554845 dracut[1283]: *** Including module: qemu ***
Dec 11 00:09:00 np0005554845 dracut[1283]: *** Including module: fstab-sys ***
Dec 11 00:09:00 np0005554845 dracut[1283]: *** Including module: rootfs-block ***
Dec 11 00:09:00 np0005554845 dracut[1283]: *** Including module: terminfo ***
Dec 11 00:09:00 np0005554845 dracut[1283]: *** Including module: udev-rules ***
Dec 11 00:09:01 np0005554845 dracut[1283]: Skipping udev rule: 91-permissions.rules
Dec 11 00:09:01 np0005554845 dracut[1283]: Skipping udev rule: 80-drivers-modprobe.rules
Dec 11 00:09:01 np0005554845 dracut[1283]: *** Including module: virtiofs ***
Dec 11 00:09:01 np0005554845 dracut[1283]: *** Including module: dracut-systemd ***
Dec 11 00:09:01 np0005554845 dracut[1283]: *** Including module: usrmount ***
Dec 11 00:09:01 np0005554845 dracut[1283]: *** Including module: base ***
Dec 11 00:09:02 np0005554845 dracut[1283]: *** Including module: fs-lib ***
Dec 11 00:09:02 np0005554845 dracut[1283]: *** Including module: kdumpbase ***
Dec 11 00:09:02 np0005554845 irqbalance[785]: Cannot change IRQ 35 affinity: Operation not permitted
Dec 11 00:09:02 np0005554845 irqbalance[785]: IRQ 35 affinity is now unmanaged
Dec 11 00:09:02 np0005554845 irqbalance[785]: Cannot change IRQ 33 affinity: Operation not permitted
Dec 11 00:09:02 np0005554845 irqbalance[785]: IRQ 33 affinity is now unmanaged
Dec 11 00:09:02 np0005554845 irqbalance[785]: Cannot change IRQ 31 affinity: Operation not permitted
Dec 11 00:09:02 np0005554845 irqbalance[785]: IRQ 31 affinity is now unmanaged
Dec 11 00:09:02 np0005554845 irqbalance[785]: Cannot change IRQ 28 affinity: Operation not permitted
Dec 11 00:09:02 np0005554845 irqbalance[785]: IRQ 28 affinity is now unmanaged
Dec 11 00:09:02 np0005554845 irqbalance[785]: Cannot change IRQ 34 affinity: Operation not permitted
Dec 11 00:09:02 np0005554845 irqbalance[785]: IRQ 34 affinity is now unmanaged
Dec 11 00:09:02 np0005554845 irqbalance[785]: Cannot change IRQ 32 affinity: Operation not permitted
Dec 11 00:09:02 np0005554845 irqbalance[785]: IRQ 32 affinity is now unmanaged
Dec 11 00:09:02 np0005554845 irqbalance[785]: Cannot change IRQ 30 affinity: Operation not permitted
Dec 11 00:09:02 np0005554845 irqbalance[785]: IRQ 30 affinity is now unmanaged
Dec 11 00:09:02 np0005554845 irqbalance[785]: Cannot change IRQ 29 affinity: Operation not permitted
Dec 11 00:09:02 np0005554845 irqbalance[785]: IRQ 29 affinity is now unmanaged
Dec 11 00:09:02 np0005554845 dracut[1283]: *** Including module: microcode_ctl-fw_dir_override ***
Dec 11 00:09:02 np0005554845 dracut[1283]:  microcode_ctl module: mangling fw_dir
Dec 11 00:09:02 np0005554845 dracut[1283]:    microcode_ctl: reset fw_dir to "/lib/firmware/updates /lib/firmware"
Dec 11 00:09:02 np0005554845 dracut[1283]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel"...
Dec 11 00:09:02 np0005554845 dracut[1283]:    microcode_ctl: configuration "intel" is ignored
Dec 11 00:09:02 np0005554845 dracut[1283]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-2d-07"...
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: configuration "intel-06-2d-07" is ignored
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4e-03"...
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: configuration "intel-06-4e-03" is ignored
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-4f-01"...
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: configuration "intel-06-4f-01" is ignored
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-55-04"...
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: configuration "intel-06-55-04" is ignored
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-5e-03"...
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: configuration "intel-06-5e-03" is ignored
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8c-01"...
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: configuration "intel-06-8c-01" is ignored
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-0xca"...
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: configuration "intel-06-8e-9e-0x-0xca" is ignored
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8e-9e-0x-dell"...
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: configuration "intel-06-8e-9e-0x-dell" is ignored
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: processing data directory  "/usr/share/microcode_ctl/ucode_with_caveats/intel-06-8f-08"...
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: configuration "intel-06-8f-08" is ignored
Dec 11 00:09:03 np0005554845 dracut[1283]:    microcode_ctl: final fw_dir: "/lib/firmware/updates /lib/firmware"
Dec 11 00:09:03 np0005554845 dracut[1283]: *** Including module: openssl ***
Dec 11 00:09:03 np0005554845 dracut[1283]: *** Including module: shutdown ***
Dec 11 00:09:03 np0005554845 dracut[1283]: *** Including module: squash ***
Dec 11 00:09:03 np0005554845 dracut[1283]: *** Including modules done ***
Dec 11 00:09:03 np0005554845 dracut[1283]: *** Installing kernel module dependencies ***
Dec 11 00:09:03 np0005554845 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 00:09:04 np0005554845 dracut[1283]: *** Installing kernel module dependencies done ***
Dec 11 00:09:04 np0005554845 dracut[1283]: *** Resolving executable dependencies ***
Dec 11 00:09:06 np0005554845 dracut[1283]: *** Resolving executable dependencies done ***
Dec 11 00:09:06 np0005554845 dracut[1283]: *** Generating early-microcode cpio image ***
Dec 11 00:09:06 np0005554845 dracut[1283]: *** Store current command line parameters ***
Dec 11 00:09:06 np0005554845 dracut[1283]: Stored kernel commandline:
Dec 11 00:09:06 np0005554845 dracut[1283]: No dracut internal kernel commandline stored in the initramfs
Dec 11 00:09:06 np0005554845 dracut[1283]: *** Install squash loader ***
Dec 11 00:09:07 np0005554845 dracut[1283]: *** Squashing the files inside the initramfs ***
Dec 11 00:09:08 np0005554845 dracut[1283]: *** Squashing the files inside the initramfs done ***
Dec 11 00:09:08 np0005554845 dracut[1283]: *** Creating image file '/boot/initramfs-5.14.0-648.el9.x86_64kdump.img' ***
Dec 11 00:09:08 np0005554845 dracut[1283]: *** Hardlinking files ***
Dec 11 00:09:08 np0005554845 dracut[1283]: *** Hardlinking files done ***
Dec 11 00:09:09 np0005554845 dracut[1283]: *** Creating initramfs image file '/boot/initramfs-5.14.0-648.el9.x86_64kdump.img' done ***
Dec 11 00:09:09 np0005554845 kdumpctl[1013]: kdump: kexec: loaded kdump kernel
Dec 11 00:09:09 np0005554845 kdumpctl[1013]: kdump: Starting kdump: [OK]
Dec 11 00:09:09 np0005554845 systemd[1]: Finished Crash recovery kernel arming.
Dec 11 00:09:09 np0005554845 systemd[1]: Startup finished in 1.642s (kernel) + 3.440s (initrd) + 19.335s (userspace) = 24.418s.
Dec 11 00:09:23 np0005554845 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 11 00:09:51 np0005554845 systemd[1]: Created slice User Slice of UID 1000.
Dec 11 00:09:51 np0005554845 systemd[1]: Starting User Runtime Directory /run/user/1000...
Dec 11 00:09:51 np0005554845 systemd-logind[789]: New session 1 of user zuul.
Dec 11 00:09:52 np0005554845 systemd[1]: Finished User Runtime Directory /run/user/1000.
Dec 11 00:09:52 np0005554845 systemd[1]: Starting User Manager for UID 1000...
Dec 11 00:09:52 np0005554845 systemd[4297]: Queued start job for default target Main User Target.
Dec 11 00:09:52 np0005554845 systemd[4297]: Created slice User Application Slice.
Dec 11 00:09:52 np0005554845 systemd[4297]: Started Mark boot as successful after the user session has run 2 minutes.
Dec 11 00:09:52 np0005554845 systemd[4297]: Started Daily Cleanup of User's Temporary Directories.
Dec 11 00:09:52 np0005554845 systemd[4297]: Reached target Paths.
Dec 11 00:09:52 np0005554845 systemd[4297]: Reached target Timers.
Dec 11 00:09:52 np0005554845 systemd[4297]: Starting D-Bus User Message Bus Socket...
Dec 11 00:09:52 np0005554845 systemd[4297]: Starting Create User's Volatile Files and Directories...
Dec 11 00:09:52 np0005554845 systemd[4297]: Finished Create User's Volatile Files and Directories.
Dec 11 00:09:52 np0005554845 systemd[4297]: Listening on D-Bus User Message Bus Socket.
Dec 11 00:09:52 np0005554845 systemd[4297]: Reached target Sockets.
Dec 11 00:09:52 np0005554845 systemd[4297]: Reached target Basic System.
Dec 11 00:09:52 np0005554845 systemd[4297]: Reached target Main User Target.
Dec 11 00:09:52 np0005554845 systemd[4297]: Startup finished in 152ms.
Dec 11 00:09:52 np0005554845 systemd[1]: Started User Manager for UID 1000.
Dec 11 00:09:52 np0005554845 systemd[1]: Started Session 1 of User zuul.
Dec 11 00:09:52 np0005554845 python3[4379]: ansible-setup Invoked with gather_subset=['!all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:09:57 np0005554845 python3[4407]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:10:03 np0005554845 python3[4465]: ansible-setup Invoked with gather_subset=['network'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:10:04 np0005554845 python3[4505]: ansible-zuul_console Invoked with path=/tmp/console-{log_uuid}.log port=19885 state=present
Dec 11 00:10:06 np0005554845 python3[4531]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDr3ukZnI+QpcKTXofHM5c4pf18YVuaBElb5fesYU6Udi0vF9J0az12/ys7QQxkKaFFHbNzqXv7IyRcX/SgeHZ4aIjAUqXfzOLp3Y14fC2l6uQ/qHac59jRdixxyQkezoGtujiQFarkfwoFP4CeIEIyUec/cN1wIPv0s3FlvwP/FP0ECmhCTp2Qs1QCpWyDhwcNDWdzp3qjgPAOfT8beRko8jtHZgMEzbVfdcnS6V2Vf7a+B8wGfhHe+2vmhnSR+dbsZUmwjBP6zf2bt4tyyt/TroDouyYYYBJKpTmMzH26NX3bcyqOePmuZjrqTjHf/oqnradY4XUjuKS1EtmxPq3oPha48sRGTcREKavS5G69dT7YPx3MRIM2dQ5TxizEQ1RQRJExjJIm66RCaghqBKjuGRjrKnV6vSAqLwmnt5qEpq1nHd0yNyY2dHSG01/BguSuoCmbi2BfdrTrYqsqukX70Avsd7ucnsWYiUyt7nVXUT0PJiLWxRfqKs8ccoNEQHs= zuul-build-sshkey manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:06 np0005554845 python3[4555]: ansible-file Invoked with state=directory path=/home/zuul/.ssh mode=448 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:07 np0005554845 python3[4654]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:10:07 np0005554845 python3[4725]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765429806.831493-253-54067493515587/source dest=/home/zuul/.ssh/id_rsa mode=384 force=False _original_basename=26090b9391024e3f8be11fe3cc2f0c0d_id_rsa follow=False checksum=d4760139f5a19c41f71ab80477d2d967c645a95c backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:08 np0005554845 python3[4848]: ansible-ansible.legacy.stat Invoked with path=/home/zuul/.ssh/id_rsa.pub follow=False get_checksum=False checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:10:08 np0005554845 python3[4919]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765429807.8561342-309-102379822613581/source dest=/home/zuul/.ssh/id_rsa.pub mode=420 force=False _original_basename=26090b9391024e3f8be11fe3cc2f0c0d_id_rsa.pub follow=False checksum=6c3b83eb7800faec237e999e0cdc4871a660fa86 backup=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:09 np0005554845 python3[4967]: ansible-ping Invoked with data=pong
Dec 11 00:10:11 np0005554845 python3[4991]: ansible-setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:10:13 np0005554845 python3[5049]: ansible-zuul_debug_info Invoked with ipv4_route_required=False ipv6_route_required=False image_manifest_files=['/etc/dib-builddate.txt', '/etc/image-hostname.txt'] image_manifest=None traceroute_host=None
Dec 11 00:10:15 np0005554845 python3[5081]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:15 np0005554845 python3[5105]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:15 np0005554845 python3[5129]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:16 np0005554845 python3[5153]: ansible-file Invoked with path=/home/zuul/zuul-output/logs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:16 np0005554845 python3[5177]: ansible-file Invoked with path=/home/zuul/zuul-output/artifacts state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:16 np0005554845 python3[5201]: ansible-file Invoked with path=/home/zuul/zuul-output/docs state=directory mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:18 np0005554845 python3[5227]: ansible-file Invoked with path=/etc/ci state=directory owner=root group=root mode=493 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:18 np0005554845 python3[5305]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/mirror_info.sh follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:10:19 np0005554845 python3[5378]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/mirror_info.sh owner=root group=root mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765429818.4057195-34-93420086843167/source follow=False _original_basename=mirror_info.sh.j2 checksum=92d92a03afdddee82732741071f662c729080c35 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:19 np0005554845 python3[5426]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAABIwAAAQEA4Z/c9osaGGtU6X8fgELwfj/yayRurfcKA0HMFfdpPxev2dbwljysMuzoVp4OZmW1gvGtyYPSNRvnzgsaabPNKNo2ym5NToCP6UM+KSe93aln4BcM/24mXChYAbXJQ5Bqq/pIzsGs/pKetQN+vwvMxLOwTvpcsCJBXaa981RKML6xj9l/UZ7IIq1HSEKMvPLxZMWdu0Ut8DkCd5F4nOw9Wgml2uYpDCj5LLCrQQ9ChdOMz8hz6SighhNlRpPkvPaet3OXxr/ytFMu7j7vv06CaEnuMMiY2aTWN1Imin9eHAylIqFHta/3gFfQSWt9jXM7owkBLKL7ATzhaAn+fjNupw== arxcruz@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:20 np0005554845 python3[5450]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDS4Fn6k4deCnIlOtLWqZJyksbepjQt04j8Ed8CGx9EKkj0fKiAxiI4TadXQYPuNHMixZy4Nevjb6aDhL5Z906TfvNHKUrjrG7G26a0k8vdc61NEQ7FmcGMWRLwwc6ReDO7lFpzYKBMk4YqfWgBuGU/K6WLKiVW2cVvwIuGIaYrE1OiiX0iVUUk7KApXlDJMXn7qjSYynfO4mF629NIp8FJal38+Kv+HA+0QkE5Y2xXnzD4Lar5+keymiCHRntPppXHeLIRzbt0gxC7v3L72hpQ3BTBEzwHpeS8KY+SX1y5lRMN45thCHfJqGmARJREDjBvWG8JXOPmVIKQtZmVcD5b mandreou@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:20 np0005554845 python3[5474]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC9MiLfy30deHA7xPOAlew5qUq3UP2gmRMYJi8PtkjFB20/DKeWwWNnkZPqP9AayruRoo51SIiVg870gbZE2jYl+Ncx/FYDe56JeC3ySZsXoAVkC9bP7gkOGqOmJjirvAgPMI7bogVz8i+66Q4Ar7OKTp3762G4IuWPPEg4ce4Y7lx9qWocZapHYq4cYKMxrOZ7SEbFSATBbe2bPZAPKTw8do/Eny+Hq/LkHFhIeyra6cqTFQYShr+zPln0Cr+ro/pDX3bB+1ubFgTpjpkkkQsLhDfR6cCdCWM2lgnS3BTtYj5Ct9/JRPR5YOphqZz+uB+OEu2IL68hmU9vNTth1KeX rlandy@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:20 np0005554845 python3[5498]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIFCbgz8gdERiJlk2IKOtkjQxEXejrio6ZYMJAVJYpOIp raukadah@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:21 np0005554845 python3[5522]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIBqb3Q/9uDf4LmihQ7xeJ9gA/STIQUFPSfyyV0m8AoQi bshewale@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:21 np0005554845 python3[5546]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQC0I8QqQx0Az2ysJt2JuffucLijhBqnsXKEIx5GyHwxVULROa8VtNFXUDH6ZKZavhiMcmfHB2+TBTda+lDP4FldYj06dGmzCY+IYGa+uDRdxHNGYjvCfLFcmLlzRK6fNbTcui+KlUFUdKe0fb9CRoGKyhlJD5GRkM1Dv+Yb6Bj+RNnmm1fVGYxzmrD2utvffYEb0SZGWxq2R9gefx1q/3wCGjeqvufEV+AskPhVGc5T7t9eyZ4qmslkLh1/nMuaIBFcr9AUACRajsvk6mXrAN1g3HlBf2gQlhi1UEyfbqIQvzzFtsbLDlSum/KmKjy818GzvWjERfQ0VkGzCd9bSLVL dviroel@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:21 np0005554845 python3[5570]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDLOQd4ZLtkZXQGY6UwAr/06ppWQK4fDO3HaqxPk98csyOCBXsliSKK39Bso828+5srIXiW7aI6aC9P5mwi4mUZlGPfJlQbfrcGvY+b/SocuvaGK+1RrHLoJCT52LBhwgrzlXio2jeksZeein8iaTrhsPrOAs7KggIL/rB9hEiB3NaOPWhhoCP4vlW6MEMExGcqB/1FVxXFBPnLkEyW0Lk7ycVflZl2ocRxbfjZi0+tI1Wlinp8PvSQSc/WVrAcDgKjc/mB4ODPOyYy3G8FHgfMsrXSDEyjBKgLKMsdCrAUcqJQWjkqXleXSYOV4q3pzL+9umK+q/e3P/bIoSFQzmJKTU1eDfuvPXmow9F5H54fii/Da7ezlMJ+wPGHJrRAkmzvMbALy7xwswLhZMkOGNtRcPqaKYRmIBKpw3o6bCTtcNUHOtOQnzwY8JzrM2eBWJBXAANYw+9/ho80JIiwhg29CFNpVBuHbql2YxJQNrnl90guN65rYNpDxdIluweyUf8= anbanerj@kaermorhen manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:22 np0005554845 python3[5594]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQC3VwV8Im9kRm49lt3tM36hj4Zv27FxGo4C1Q/0jqhzFmHY7RHbmeRr8ObhwWoHjXSozKWg8FL5ER0z3hTwL0W6lez3sL7hUaCmSuZmG5Hnl3x4vTSxDI9JZ/Y65rtYiiWQo2fC5xJhU/4+0e5e/pseCm8cKRSu+SaxhO+sd6FDojA2x1BzOzKiQRDy/1zWGp/cZkxcEuB1wHI5LMzN03c67vmbu+fhZRAUO4dQkvcnj2LrhQtpa+ytvnSjr8icMDosf1OsbSffwZFyHB/hfWGAfe0eIeSA2XPraxiPknXxiPKx2MJsaUTYbsZcm3EjFdHBBMumw5rBI74zLrMRvCO9GwBEmGT4rFng1nP+yw5DB8sn2zqpOsPg1LYRwCPOUveC13P6pgsZZPh812e8v5EKnETct+5XI3dVpdw6CnNiLwAyVAF15DJvBGT/u1k0Myg/bQn+Gv9k2MSj6LvQmf6WbZu2Wgjm30z3FyCneBqTL7mLF19YXzeC0ufHz5pnO1E= dasm@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:22 np0005554845 python3[5618]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIHUnwjB20UKmsSed9X73eGNV5AOEFccQ3NYrRW776pEk cjeanner manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:22 np0005554845 python3[5642]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDercCMGn8rW1C4P67tHgtflPdTeXlpyUJYH+6XDd2lR jgilaber@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:22 np0005554845 python3[5666]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIAMI6kkg9Wg0sG7jIJmyZemEBwUn1yzNpQQd3gnulOmZ adrianfuscoarnejo@gmail.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:23 np0005554845 python3[5690]: ansible-authorized_key Invoked with user=zuul state=present key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBPijwpQu/3jhhhBZInXNOLEH57DrknPc3PLbsRvYyJIFzwYjX+WD4a7+nGnMYS42MuZk6TJcVqgnqofVx4isoD4= ramishra@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:23 np0005554845 python3[5714]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIGpU/BepK3qX0NRf5Np+dOBDqzQEefhNrw2DCZaH3uWW rebtoor@monolith manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:23 np0005554845 python3[5738]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDK0iKdi8jQTpQrDdLVH/AAgLVYyTXF7AQ1gjc/5uT3t ykarel@yatinkarel manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:24 np0005554845 python3[5762]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIF/V/cLotA6LZeO32VL45Hd78skuA2lJA425Sm2LlQeZ fmount@horcrux manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:24 np0005554845 python3[5786]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIDa7QCjuDMVmRPo1rREbGwzYeBCYVN+Ou/3WKXZEC6Sr manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:24 np0005554845 python3[5810]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQCfNtF7NvKl915TGsGGoseUb06Hj8L/S4toWf0hExeY+F00woL6NvBlJD0nDct+P5a22I4EhvoQCRQ8reaPCm1lybR3uiRIJsj+8zkVvLwby9LXzfZorlNG9ofjd00FEmB09uW/YvTl6Q9XwwwX6tInzIOv3TMqTHHGOL74ibbj8J/FJR0cFEyj0z4WQRvtkh32xAHl83gbuINryMt0sqRI+clj2381NKL55DRLQrVw0gsfqqxiHAnXg21qWmc4J+b9e9kiuAFQjcjwTVkwJCcg3xbPwC/qokYRby/Y5S40UUd7/jEARGXT7RZgpzTuDd1oZiCVrnrqJNPaMNdVv5MLeFdf1B7iIe5aa/fGouX7AO4SdKhZUdnJmCFAGvjC6S3JMZ2wAcUl+OHnssfmdj7XL50cLo27vjuzMtLAgSqi6N99m92WCF2s8J9aVzszX7Xz9OKZCeGsiVJp3/NdABKzSEAyM9xBD/5Vho894Sav+otpySHe3p6RUTgbB5Zu8VyZRZ/UtB3ueXxyo764yrc6qWIDqrehm84Xm9g+/jpIBzGPl07NUNJpdt/6Sgf9RIKXw/7XypO5yZfUcuFNGTxLfqjTNrtgLZNcjfav6sSdVXVcMPL//XNuRdKmVFaO76eV/oGMQGr1fGcCD+N+CpI7+Q+fCNB6VFWG4nZFuI/Iuw== averdagu@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:25 np0005554845 python3[5834]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDq8l27xI+QlQVdS4djp9ogSoyrNE2+Ox6vKPdhSNL1J3PE5w+WCSvMz9A5gnNuH810zwbekEApbxTze/gLQJwBHA52CChfURpXrFaxY7ePXRElwKAL3mJfzBWY/c5jnNL9TCVmFJTGZkFZP3Nh+BMgZvL6xBkt3WKm6Uq18qzd9XeKcZusrA+O+uLv1fVeQnadY9RIqOCyeFYCzLWrUfTyE8x/XG0hAWIM7qpnF2cALQS2h9n4hW5ybiUN790H08wf9hFwEf5nxY9Z9dVkPFQiTSGKNBzmnCXU9skxS/xhpFjJ5duGSZdtAHe9O+nGZm9c67hxgtf8e5PDuqAdXEv2cf6e3VBAt+Bz8EKI3yosTj0oZHfwr42Yzb1l/SKy14Rggsrc9KAQlrGXan6+u2jcQqqx7l+SWmnpFiWTV9u5cWj2IgOhApOitmRBPYqk9rE2usfO0hLn/Pj/R/Nau4803e1/EikdLE7Ps95s9mX5jRDjAoUa2JwFF5RsVFyL910= ashigupt@ashigupt.remote.csb manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:25 np0005554845 python3[5858]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIOKLl0NYKwoZ/JY5KeZU8VwRAggeOxqQJeoqp3dsAaY9 manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:25 np0005554845 python3[5882]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIASASQOH2BcOyLKuuDOdWZlPi2orcjcA8q4400T73DLH evallesp@fedora manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:25 np0005554845 python3[5906]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILeBWlamUph+jRKV2qrx1PGU7vWuGIt5+z9k96I8WehW amsinha@amsinha-mac manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:26 np0005554845 python3[5930]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIANvVgvJBlK3gb1yz5uef/JqIGq4HLEmY2dYA8e37swb morenod@redhat-laptop manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:26 np0005554845 python3[5954]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAACAQDZdI7t1cxYx65heVI24HTV4F7oQLW1zyfxHreL2TIJKxjyrUUKIFEUmTutcBlJRLNT2Eoix6x1sOw9YrchloCLcn//SGfTElr9mSc5jbjb7QXEU+zJMhtxyEJ1Po3CUGnj7ckiIXw7wcawZtrEOAQ9pH3ExYCJcEMiyNjRQZCxT3tPK+S4B95EWh5Fsrz9CkwpjNRPPH7LigCeQTM3Wc7r97utAslBUUvYceDSLA7rMgkitJE38b7rZBeYzsGQ8YYUBjTCtehqQXxCRjizbHWaaZkBU+N3zkKB6n/iCNGIO690NK7A/qb6msTijiz1PeuM8ThOsi9qXnbX5v0PoTpcFSojV7NHAQ71f0XXuS43FhZctT+Dcx44dT8Fb5vJu2cJGrk+qF8ZgJYNpRS7gPg0EG2EqjK7JMf9ULdjSu0r+KlqIAyLvtzT4eOnQipoKlb/WG5D/0ohKv7OMQ352ggfkBFIQsRXyyTCT98Ft9juqPuahi3CAQmP4H9dyE+7+Kz437PEtsxLmfm6naNmWi7Ee1DqWPwS8rEajsm4sNM4wW9gdBboJQtc0uZw0DfLj1I9r3Mc8Ol0jYtz0yNQDSzVLrGCaJlC311trU70tZ+ZkAVV6Mn8lOhSbj1cK0lvSr6ZK4dgqGl3I1eTZJJhbLNdg7UOVaiRx9543+C/p/As7w== brjackma@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:26 np0005554845 python3[5978]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIKwedoZ0TWPJX/z/4TAbO/kKcDZOQVgRH0hAqrL5UCI1 vcastell@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:27 np0005554845 python3[6002]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAIEmv8sE8GCk6ZTPIqF0FQrttBdL3mq7rCm/IJy0xDFh7 michburk@redhat.com manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:27 np0005554845 python3[6026]: ansible-authorized_key Invoked with user=zuul state=present key=ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICy6GpGEtwevXEEn4mmLR5lmSLe23dGgAvzkB9DMNbkf rsafrono@rsafrono manage_dir=True exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:10:29 np0005554845 python3[6052]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 11 00:10:29 np0005554845 systemd[1]: Starting Time & Date Service...
Dec 11 00:10:29 np0005554845 systemd[1]: Started Time & Date Service.
Dec 11 00:10:30 np0005554845 systemd-timedated[6054]: Changed time zone to 'UTC' (UTC).
Dec 11 00:10:30 np0005554845 python3[6083]: ansible-file Invoked with path=/etc/nodepool state=directory mode=511 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:30 np0005554845 python3[6159]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:10:31 np0005554845 python3[6230]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes src=/home/zuul/.ansible/tmp/ansible-tmp-1765429830.7027156-254-242894805418987/source _original_basename=tmp6znh8xei follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:31 np0005554845 python3[6330]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/sub_nodes_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:10:32 np0005554845 python3[6401]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/sub_nodes_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765429831.6027112-305-159752743532357/source _original_basename=tmp6pbkrjgy follow=False checksum=da39a3ee5e6b4b0d3255bfef95601890afd80709 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:33 np0005554845 python3[6503]: ansible-ansible.legacy.stat Invoked with path=/etc/nodepool/node_private follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:10:33 np0005554845 python3[6576]: ansible-ansible.legacy.copy Invoked with dest=/etc/nodepool/node_private src=/home/zuul/.ansible/tmp/ansible-tmp-1765429832.8503664-384-159136536891586/source _original_basename=tmpyxixob74 follow=False checksum=aad8ab19ba5ef1801e0e8aebf96af2ca109a6077 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:34 np0005554845 python3[6624]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa /etc/nodepool/id_rsa zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:10:34 np0005554845 python3[6650]: ansible-ansible.legacy.command Invoked with _raw_params=cp .ssh/id_rsa.pub /etc/nodepool/id_rsa.pub zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:10:34 np0005554845 python3[6730]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/zuul-sudo-grep follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:10:35 np0005554845 python3[6803]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/zuul-sudo-grep mode=288 src=/home/zuul/.ansible/tmp/ansible-tmp-1765429834.655383-455-271076553394798/source _original_basename=tmpecewetxv follow=False checksum=bdca1a77493d00fb51567671791f4aa30f66c2f0 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:36 np0005554845 python3[6854]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/visudo -c zuul_log_id=fa163ec2-ffbe-7c02-d812-00000000001f-1-compute2 zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:10:36 np0005554845 python3[6882]: ansible-ansible.legacy.command Invoked with executable=/bin/bash _raw_params=env#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-7c02-d812-000000000020-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None creates=None removes=None stdin=None
Dec 11 00:10:38 np0005554845 python3[6910]: ansible-file Invoked with path=/home/zuul/workspace state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:10:56 np0005554845 python3[6936]: ansible-ansible.builtin.file Invoked with path=/etc/ci/env state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:11:00 np0005554845 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 11 00:11:56 np0005554845 systemd-logind[789]: Session 1 logged out. Waiting for processes to exit.
Dec 11 00:12:19 np0005554845 kernel: pci 0000:00:07.0: [1af4:1000] type 00 class 0x020000 conventional PCI endpoint
Dec 11 00:12:19 np0005554845 kernel: pci 0000:00:07.0: BAR 0 [io  0x0000-0x003f]
Dec 11 00:12:19 np0005554845 kernel: pci 0000:00:07.0: BAR 1 [mem 0x00000000-0x00000fff]
Dec 11 00:12:19 np0005554845 kernel: pci 0000:00:07.0: BAR 4 [mem 0x00000000-0x00003fff 64bit pref]
Dec 11 00:12:19 np0005554845 kernel: pci 0000:00:07.0: ROM [mem 0x00000000-0x0007ffff pref]
Dec 11 00:12:19 np0005554845 kernel: pci 0000:00:07.0: ROM [mem 0xc0000000-0xc007ffff pref]: assigned
Dec 11 00:12:19 np0005554845 kernel: pci 0000:00:07.0: BAR 4 [mem 0x240000000-0x240003fff 64bit pref]: assigned
Dec 11 00:12:19 np0005554845 kernel: pci 0000:00:07.0: BAR 1 [mem 0xc0080000-0xc0080fff]: assigned
Dec 11 00:12:19 np0005554845 kernel: pci 0000:00:07.0: BAR 0 [io  0x1000-0x103f]: assigned
Dec 11 00:12:19 np0005554845 kernel: virtio-pci 0000:00:07.0: enabling device (0000 -> 0003)
Dec 11 00:12:19 np0005554845 NetworkManager[856]: <info>  [1765429939.5918] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 11 00:12:19 np0005554845 systemd-udevd[6939]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 00:12:19 np0005554845 NetworkManager[856]: <info>  [1765429939.6110] device (eth1): state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:12:19 np0005554845 NetworkManager[856]: <info>  [1765429939.6140] settings: (eth1): created default wired connection 'Wired connection 1'
Dec 11 00:12:19 np0005554845 NetworkManager[856]: <info>  [1765429939.6145] device (eth1): carrier: link connected
Dec 11 00:12:19 np0005554845 NetworkManager[856]: <info>  [1765429939.6147] device (eth1): state change: unavailable -> disconnected (reason 'carrier-changed', managed-type: 'full')
Dec 11 00:12:19 np0005554845 NetworkManager[856]: <info>  [1765429939.6152] policy: auto-activating connection 'Wired connection 1' (d75b84f7-0cdf-34c5-b61b-91cf7fbf3605)
Dec 11 00:12:19 np0005554845 NetworkManager[856]: <info>  [1765429939.6156] device (eth1): Activation: starting connection 'Wired connection 1' (d75b84f7-0cdf-34c5-b61b-91cf7fbf3605)
Dec 11 00:12:19 np0005554845 NetworkManager[856]: <info>  [1765429939.6157] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:12:19 np0005554845 NetworkManager[856]: <info>  [1765429939.6159] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:12:19 np0005554845 NetworkManager[856]: <info>  [1765429939.6164] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:12:19 np0005554845 NetworkManager[856]: <info>  [1765429939.6168] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 11 00:12:19 np0005554845 systemd[4297]: Starting Mark boot as successful...
Dec 11 00:12:19 np0005554845 systemd[4297]: Finished Mark boot as successful.
Dec 11 00:12:20 np0005554845 systemd-logind[789]: New session 3 of user zuul.
Dec 11 00:12:20 np0005554845 systemd[1]: Started Session 3 of User zuul.
Dec 11 00:12:20 np0005554845 python3[6971]: ansible-ansible.legacy.command Invoked with _raw_params=ip -j link zuul_log_id=fa163ec2-ffbe-e0ad-9636-0000000001ea-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:12:27 np0005554845 python3[7051]: ansible-ansible.legacy.stat Invoked with path=/etc/NetworkManager/system-connections/ci-private-network.nmconnection follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:12:28 np0005554845 python3[7124]: ansible-ansible.legacy.copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765429947.567887-206-106920246216020/source dest=/etc/NetworkManager/system-connections/ci-private-network.nmconnection mode=0600 owner=root group=root follow=False _original_basename=bootstrap-ci-network-nm-connection.nmconnection.j2 checksum=ae5419e91aefcb8b1a70b3923818455626703b77 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:12:28 np0005554845 python3[7174]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:12:28 np0005554845 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 11 00:12:28 np0005554845 systemd[1]: Stopped Network Manager Wait Online.
Dec 11 00:12:28 np0005554845 systemd[1]: Stopping Network Manager Wait Online...
Dec 11 00:12:28 np0005554845 systemd[1]: Stopping Network Manager...
Dec 11 00:12:28 np0005554845 NetworkManager[856]: <info>  [1765429948.7967] caught SIGTERM, shutting down normally.
Dec 11 00:12:28 np0005554845 NetworkManager[856]: <info>  [1765429948.7979] dhcp4 (eth0): canceled DHCP transaction
Dec 11 00:12:28 np0005554845 NetworkManager[856]: <info>  [1765429948.7979] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 00:12:28 np0005554845 NetworkManager[856]: <info>  [1765429948.7980] dhcp4 (eth0): state changed no lease
Dec 11 00:12:28 np0005554845 NetworkManager[856]: <info>  [1765429948.7983] manager: NetworkManager state is now CONNECTING
Dec 11 00:12:28 np0005554845 NetworkManager[856]: <info>  [1765429948.8145] dhcp4 (eth1): canceled DHCP transaction
Dec 11 00:12:28 np0005554845 NetworkManager[856]: <info>  [1765429948.8146] dhcp4 (eth1): state changed no lease
Dec 11 00:12:28 np0005554845 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 00:12:28 np0005554845 NetworkManager[856]: <info>  [1765429948.8223] exiting (success)
Dec 11 00:12:28 np0005554845 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 00:12:28 np0005554845 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 11 00:12:28 np0005554845 systemd[1]: Stopped Network Manager.
Dec 11 00:12:28 np0005554845 systemd[1]: NetworkManager.service: Consumed 1.784s CPU time, 10.1M memory peak.
Dec 11 00:12:28 np0005554845 systemd[1]: Starting Network Manager...
Dec 11 00:12:28 np0005554845 NetworkManager[7186]: <info>  [1765429948.8848] NetworkManager (version 1.54.2-1.el9) is starting... (after a restart, boot:6fd4de4e-7b3d-47fa-92ea-8323052e6a02)
Dec 11 00:12:28 np0005554845 NetworkManager[7186]: <info>  [1765429948.8852] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 11 00:12:28 np0005554845 NetworkManager[7186]: <info>  [1765429948.8915] manager[0x556646fee000]: monitoring kernel firmware directory '/lib/firmware'.
Dec 11 00:12:28 np0005554845 systemd[1]: Starting Hostname Service...
Dec 11 00:12:28 np0005554845 systemd[1]: Started Hostname Service.
Dec 11 00:12:28 np0005554845 NetworkManager[7186]: <info>  [1765429948.9986] hostname: hostname: using hostnamed
Dec 11 00:12:28 np0005554845 NetworkManager[7186]: <info>  [1765429948.9987] hostname: static hostname changed from (none) to "np0005554845.novalocal"
Dec 11 00:12:28 np0005554845 NetworkManager[7186]: <info>  [1765429948.9993] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 11 00:12:28 np0005554845 NetworkManager[7186]: <info>  [1765429948.9999] manager[0x556646fee000]: rfkill: Wi-Fi hardware radio set enabled
Dec 11 00:12:28 np0005554845 NetworkManager[7186]: <info>  [1765429948.9999] manager[0x556646fee000]: rfkill: WWAN hardware radio set enabled
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0032] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0032] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0033] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0034] manager: Networking is enabled by state file
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0036] settings: Loaded settings plugin: keyfile (internal)
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0040] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0066] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0077] dhcp: init: Using DHCP client 'internal'
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0079] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0084] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0089] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0096] device (lo): Activation: starting connection 'lo' (98463273-a93b-4216-b1e6-11bf6b9079be)
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0102] device (eth0): carrier: link connected
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0107] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0111] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0111] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0117] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0122] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0127] device (eth1): carrier: link connected
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0131] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0136] manager: (eth1): assume: will attempt to assume matching connection 'Wired connection 1' (d75b84f7-0cdf-34c5-b61b-91cf7fbf3605) (indicated)
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0137] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0141] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0147] device (eth1): Activation: starting connection 'Wired connection 1' (d75b84f7-0cdf-34c5-b61b-91cf7fbf3605)
Dec 11 00:12:29 np0005554845 systemd[1]: Started Network Manager.
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0154] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0158] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0160] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0161] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0163] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0166] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0168] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0170] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0172] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0178] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0181] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0189] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0191] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0210] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0211] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0215] device (lo): Activation: successful, device activated.
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0245] dhcp4 (eth0): state changed new lease, address=38.102.83.9
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0250] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 11 00:12:29 np0005554845 systemd[1]: Starting Network Manager Wait Online...
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0308] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0335] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0337] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0340] manager: NetworkManager state is now CONNECTED_SITE
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0343] device (eth0): Activation: successful, device activated.
Dec 11 00:12:29 np0005554845 NetworkManager[7186]: <info>  [1765429949.0347] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 11 00:12:29 np0005554845 python3[7258]: ansible-ansible.legacy.command Invoked with _raw_params=ip route zuul_log_id=fa163ec2-ffbe-e0ad-9636-0000000000d3-0-controller zuul_ansible_split_streams=False _uses_shell=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:12:39 np0005554845 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 00:12:59 np0005554845 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3002] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 11 00:13:14 np0005554845 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 00:13:14 np0005554845 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3320] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3325] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3336] device (eth1): Activation: successful, device activated.
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3347] manager: startup complete
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3349] device (eth1): state change: activated -> failed (reason 'ip-config-unavailable', managed-type: 'full')
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <warn>  [1765429994.3358] device (eth1): Activation: failed for connection 'Wired connection 1'
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3370] device (eth1): state change: failed -> disconnected (reason 'none', managed-type: 'full')
Dec 11 00:13:14 np0005554845 systemd[1]: Finished Network Manager Wait Online.
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3595] dhcp4 (eth1): canceled DHCP transaction
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3597] dhcp4 (eth1): activation: beginning transaction (timeout in 45 seconds)
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3599] dhcp4 (eth1): state changed no lease
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3628] policy: auto-activating connection 'ci-private-network' (b09f1a0b-2ca1-54a3-83c7-38f8c4ed5856)
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3639] device (eth1): Activation: starting connection 'ci-private-network' (b09f1a0b-2ca1-54a3-83c7-38f8c4ed5856)
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3642] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3650] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3664] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3680] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3746] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3751] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:13:14 np0005554845 NetworkManager[7186]: <info>  [1765429994.3759] device (eth1): Activation: successful, device activated.
Dec 11 00:13:24 np0005554845 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 00:13:29 np0005554845 systemd[1]: session-3.scope: Deactivated successfully.
Dec 11 00:13:29 np0005554845 systemd[1]: session-3.scope: Consumed 1.671s CPU time.
Dec 11 00:13:29 np0005554845 systemd-logind[789]: Session 3 logged out. Waiting for processes to exit.
Dec 11 00:13:29 np0005554845 systemd-logind[789]: Removed session 3.
Dec 11 00:13:36 np0005554845 systemd-logind[789]: New session 4 of user zuul.
Dec 11 00:13:36 np0005554845 systemd[1]: Started Session 4 of User zuul.
Dec 11 00:13:37 np0005554845 python3[7367]: ansible-ansible.legacy.stat Invoked with path=/etc/ci/env/networking-info.yml follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:13:37 np0005554845 python3[7440]: ansible-ansible.legacy.copy Invoked with dest=/etc/ci/env/networking-info.yml owner=root group=root mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765430017.071459-365-252888889326430/source _original_basename=tmplkwcvc2t follow=False checksum=ca25173b24391e7ad7df87366d8f51bbee1c8179 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:13:40 np0005554845 systemd[1]: session-4.scope: Deactivated successfully.
Dec 11 00:13:40 np0005554845 systemd-logind[789]: Session 4 logged out. Waiting for processes to exit.
Dec 11 00:13:40 np0005554845 systemd-logind[789]: Removed session 4.
Dec 11 00:15:45 np0005554845 systemd[4297]: Created slice User Background Tasks Slice.
Dec 11 00:15:45 np0005554845 systemd[4297]: Starting Cleanup of User's Temporary Files and Directories...
Dec 11 00:15:45 np0005554845 systemd[4297]: Finished Cleanup of User's Temporary Files and Directories.
Dec 11 00:18:48 np0005554845 systemd-logind[789]: New session 5 of user zuul.
Dec 11 00:18:48 np0005554845 systemd[1]: Started Session 5 of User zuul.
Dec 11 00:18:48 np0005554845 python3[7500]: ansible-ansible.legacy.command Invoked with _raw_params=lsblk -nd -o MAJ:MIN /dev/vda#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-6f38-515b-000000001f03-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:18:49 np0005554845 python3[7529]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/init.scope state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:18:49 np0005554845 python3[7555]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/machine.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:18:49 np0005554845 python3[7581]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/system.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:18:50 np0005554845 python3[7607]: ansible-ansible.builtin.file Invoked with path=/sys/fs/cgroup/user.slice state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:18:50 np0005554845 python3[7633]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system.conf.d state=directory mode=0755 recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:18:51 np0005554845 python3[7711]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system.conf.d/override.conf follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:18:51 np0005554845 python3[7784]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system.conf.d/override.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765430330.9867494-519-6232831813415/source _original_basename=tmpcom0lipo follow=False checksum=a05098bd3d2321238ea1169d0e6f135b35b392d4 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:18:52 np0005554845 python3[7834]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:18:52 np0005554845 systemd[1]: Reloading.
Dec 11 00:18:52 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:18:54 np0005554845 python3[7890]: ansible-ansible.builtin.wait_for Invoked with path=/sys/fs/cgroup/system.slice/io.max state=present timeout=30 host=127.0.0.1 connect_timeout=5 delay=0 active_connection_states=['ESTABLISHED', 'FIN_WAIT1', 'FIN_WAIT2', 'SYN_RECV', 'SYN_SENT', 'TIME_WAIT'] sleep=1 port=None search_regex=None exclude_hosts=None msg=None
Dec 11 00:18:54 np0005554845 python3[7916]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/init.scope/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:18:55 np0005554845 python3[7944]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/machine.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:18:55 np0005554845 python3[7972]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/system.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:18:55 np0005554845 python3[8000]: ansible-ansible.legacy.command Invoked with _raw_params=echo "252:0   riops=18000 wiops=18000 rbps=262144000 wbps=262144000" > /sys/fs/cgroup/user.slice/io.max#012 _uses_shell=True zuul_log_id=in-loop-ignore zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:18:56 np0005554845 python3[8027]: ansible-ansible.legacy.command Invoked with _raw_params=echo "init";    cat /sys/fs/cgroup/init.scope/io.max; echo "machine"; cat /sys/fs/cgroup/machine.slice/io.max; echo "system";  cat /sys/fs/cgroup/system.slice/io.max; echo "user";    cat /sys/fs/cgroup/user.slice/io.max;#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-6f38-515b-000000001f0a-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:18:57 np0005554845 python3[8057]: ansible-ansible.builtin.stat Invoked with path=/sys/fs/cgroup/kubepods.slice/io.max follow=False get_md5=False get_checksum=True get_mime=True get_attributes=True checksum_algorithm=sha1
Dec 11 00:19:00 np0005554845 systemd-logind[789]: Session 5 logged out. Waiting for processes to exit.
Dec 11 00:19:00 np0005554845 systemd[1]: session-5.scope: Deactivated successfully.
Dec 11 00:19:00 np0005554845 systemd[1]: session-5.scope: Consumed 4.495s CPU time.
Dec 11 00:19:00 np0005554845 systemd-logind[789]: Removed session 5.
Dec 11 00:19:02 np0005554845 systemd-logind[789]: New session 6 of user zuul.
Dec 11 00:19:02 np0005554845 systemd[1]: Started Session 6 of User zuul.
Dec 11 00:19:02 np0005554845 python3[8090]: ansible-ansible.legacy.dnf Invoked with name=['podman', 'buildah'] state=present allow_downgrade=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_repoquery=True install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 allowerasing=False nobest=False use_backend=auto conf_file=None disable_excludes=None download_dir=None list=None releasever=None
Dec 11 00:19:15 np0005554845 kernel: SELinux:  Converting 385 SID table entries...
Dec 11 00:19:15 np0005554845 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 00:19:15 np0005554845 kernel: SELinux:  policy capability open_perms=1
Dec 11 00:19:15 np0005554845 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 00:19:15 np0005554845 kernel: SELinux:  policy capability always_check_network=0
Dec 11 00:19:15 np0005554845 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 00:19:15 np0005554845 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 00:19:15 np0005554845 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 00:19:23 np0005554845 kernel: SELinux:  Converting 385 SID table entries...
Dec 11 00:19:23 np0005554845 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 00:19:23 np0005554845 kernel: SELinux:  policy capability open_perms=1
Dec 11 00:19:23 np0005554845 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 00:19:23 np0005554845 kernel: SELinux:  policy capability always_check_network=0
Dec 11 00:19:23 np0005554845 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 00:19:23 np0005554845 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 00:19:23 np0005554845 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 00:19:32 np0005554845 kernel: SELinux:  Converting 385 SID table entries...
Dec 11 00:19:32 np0005554845 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 00:19:32 np0005554845 kernel: SELinux:  policy capability open_perms=1
Dec 11 00:19:32 np0005554845 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 00:19:32 np0005554845 kernel: SELinux:  policy capability always_check_network=0
Dec 11 00:19:32 np0005554845 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 00:19:32 np0005554845 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 00:19:32 np0005554845 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 00:19:33 np0005554845 setsebool[8154]: The virt_use_nfs policy boolean was changed to 1 by root
Dec 11 00:19:33 np0005554845 setsebool[8154]: The virt_sandbox_use_all_caps policy boolean was changed to 1 by root
Dec 11 00:19:44 np0005554845 kernel: SELinux:  Converting 388 SID table entries...
Dec 11 00:19:44 np0005554845 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 00:19:44 np0005554845 kernel: SELinux:  policy capability open_perms=1
Dec 11 00:19:44 np0005554845 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 00:19:44 np0005554845 kernel: SELinux:  policy capability always_check_network=0
Dec 11 00:19:44 np0005554845 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 00:19:44 np0005554845 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 00:19:44 np0005554845 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 00:20:03 np0005554845 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 11 00:20:03 np0005554845 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 00:20:03 np0005554845 systemd[1]: Starting man-db-cache-update.service...
Dec 11 00:20:03 np0005554845 systemd[1]: Reloading.
Dec 11 00:20:03 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:20:03 np0005554845 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 00:20:12 np0005554845 python3[14132]: ansible-ansible.legacy.command Invoked with _raw_params=echo "openstack-k8s-operators+cirobot"#012 _uses_shell=True zuul_log_id=fa163ec2-ffbe-8c32-a5ee-00000000000c-1-compute2 zuul_ansible_split_streams=False warn=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:20:13 np0005554845 kernel: evm: overlay not supported
Dec 11 00:20:13 np0005554845 systemd[4297]: Starting D-Bus User Message Bus...
Dec 11 00:20:13 np0005554845 dbus-broker-launch[14589]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +31: Eavesdropping is deprecated and ignored
Dec 11 00:20:13 np0005554845 dbus-broker-launch[14589]: Policy to allow eavesdropping in /usr/share/dbus-1/session.conf +33: Eavesdropping is deprecated and ignored
Dec 11 00:20:13 np0005554845 systemd[4297]: Started D-Bus User Message Bus.
Dec 11 00:20:13 np0005554845 dbus-broker-lau[14589]: Ready
Dec 11 00:20:13 np0005554845 systemd[4297]: selinux: avc:  op=load_policy lsm=selinux seqno=6 res=1
Dec 11 00:20:13 np0005554845 systemd[4297]: Created slice Slice /user.
Dec 11 00:20:13 np0005554845 systemd[4297]: podman-14509.scope: unit configures an IP firewall, but not running as root.
Dec 11 00:20:13 np0005554845 systemd[4297]: (This warning is only shown for the first unit using IP firewalling.)
Dec 11 00:20:13 np0005554845 systemd[4297]: Started podman-14509.scope.
Dec 11 00:20:13 np0005554845 systemd[4297]: Started podman-pause-1d9b53ac.scope.
Dec 11 00:20:14 np0005554845 python3[14911]: ansible-ansible.builtin.blockinfile Invoked with state=present insertafter=EOF dest=/etc/containers/registries.conf content=[[registry]]#012location = "38.102.83.115:5001"#012insecure = true path=/etc/containers/registries.conf block=[[registry]]#012location = "38.102.83.115:5001"#012insecure = true marker=# {mark} ANSIBLE MANAGED BLOCK create=False backup=False marker_begin=BEGIN marker_end=END unsafe_writes=False insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:20:14 np0005554845 python3[14911]: ansible-ansible.builtin.blockinfile [WARNING] Module remote_tmp /root/.ansible/tmp did not exist and was created with a mode of 0700, this may cause issues when running as another user. To avoid this, create the remote_tmp dir with the correct permissions manually
Dec 11 00:20:14 np0005554845 systemd[1]: session-6.scope: Deactivated successfully.
Dec 11 00:20:14 np0005554845 systemd[1]: session-6.scope: Consumed 59.654s CPU time.
Dec 11 00:20:14 np0005554845 systemd-logind[789]: Session 6 logged out. Waiting for processes to exit.
Dec 11 00:20:14 np0005554845 systemd-logind[789]: Removed session 6.
Dec 11 00:20:38 np0005554845 systemd-logind[789]: New session 7 of user zuul.
Dec 11 00:20:38 np0005554845 systemd[1]: Started Session 7 of User zuul.
Dec 11 00:20:38 np0005554845 python3[23095]: ansible-ansible.posix.authorized_key Invoked with user=zuul key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBgab5Fr1TGjX/h89Y+ZLzPPd5jPPH04g/U9CZyWCuejMDN/LFQpNbmA865FEB9d/g79oA+3woj4Pwqd2YfvM7g= zuul@np0005554842.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:20:38 np0005554845 python3[23266]: ansible-ansible.posix.authorized_key Invoked with user=root key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBgab5Fr1TGjX/h89Y+ZLzPPd5jPPH04g/U9CZyWCuejMDN/LFQpNbmA865FEB9d/g79oA+3woj4Pwqd2YfvM7g= zuul@np0005554842.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:20:39 np0005554845 python3[23563]: ansible-ansible.builtin.user Invoked with name=cloud-admin shell=/bin/bash state=present non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on np0005554845.novalocal update_password=always uid=None group=None groups=None comment=None home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None
Dec 11 00:20:40 np0005554845 python3[23775]: ansible-ansible.posix.authorized_key Invoked with user=cloud-admin key=ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBBgab5Fr1TGjX/h89Y+ZLzPPd5jPPH04g/U9CZyWCuejMDN/LFQpNbmA865FEB9d/g79oA+3woj4Pwqd2YfvM7g= zuul@np0005554842.novalocal#012 manage_dir=True state=present exclusive=False validate_certs=True follow=False path=None key_options=None comment=None
Dec 11 00:20:40 np0005554845 python3[24015]: ansible-ansible.legacy.stat Invoked with path=/etc/sudoers.d/cloud-admin follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:20:41 np0005554845 python3[24233]: ansible-ansible.legacy.copy Invoked with dest=/etc/sudoers.d/cloud-admin mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765430440.341077-169-251680086739393/source _original_basename=tmpceb5dcbe follow=False checksum=e7614e5ad3ab06eaae55b8efaa2ed81b63ea5634 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:20:41 np0005554845 python3[24557]: ansible-ansible.builtin.hostname Invoked with name=compute-2 use=systemd
Dec 11 00:20:42 np0005554845 systemd[1]: Starting Hostname Service...
Dec 11 00:20:42 np0005554845 systemd[1]: Started Hostname Service.
Dec 11 00:20:42 np0005554845 systemd-hostnamed[24658]: Changed pretty hostname to 'compute-2'
Dec 11 00:20:42 np0005554845 systemd-hostnamed[24658]: Hostname set to <compute-2> (static)
Dec 11 00:20:42 np0005554845 NetworkManager[7186]: <info>  [1765430442.1678] hostname: static hostname changed from "np0005554845.novalocal" to "compute-2"
Dec 11 00:20:42 np0005554845 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 00:20:42 np0005554845 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 00:20:42 np0005554845 systemd[1]: session-7.scope: Deactivated successfully.
Dec 11 00:20:42 np0005554845 systemd[1]: session-7.scope: Consumed 2.481s CPU time.
Dec 11 00:20:42 np0005554845 systemd-logind[789]: Session 7 logged out. Waiting for processes to exit.
Dec 11 00:20:42 np0005554845 systemd-logind[789]: Removed session 7.
Dec 11 00:20:52 np0005554845 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 00:20:57 np0005554845 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 00:20:57 np0005554845 systemd[1]: Finished man-db-cache-update.service.
Dec 11 00:20:57 np0005554845 systemd[1]: man-db-cache-update.service: Consumed 1min 6.883s CPU time.
Dec 11 00:20:58 np0005554845 systemd[1]: run-re1583212c7274673a7953a81a34ae0dd.service: Deactivated successfully.
Dec 11 00:21:12 np0005554845 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 11 00:24:28 np0005554845 systemd[1]: Starting Cleanup of Temporary Directories...
Dec 11 00:24:28 np0005554845 systemd-logind[789]: New session 8 of user zuul.
Dec 11 00:24:28 np0005554845 systemd[1]: Started Session 8 of User zuul.
Dec 11 00:24:28 np0005554845 systemd[1]: systemd-tmpfiles-clean.service: Deactivated successfully.
Dec 11 00:24:28 np0005554845 systemd[1]: Finished Cleanup of Temporary Directories.
Dec 11 00:24:28 np0005554845 systemd[1]: run-credentials-systemd\x2dtmpfiles\x2dclean.service.mount: Deactivated successfully.
Dec 11 00:24:28 np0005554845 python3[29993]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:24:30 np0005554845 python3[30109]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:24:32 np0005554845 python3[30182]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765430670.4924688-33946-167845025829764/source mode=0755 _original_basename=delorean.repo follow=False checksum=0f7c85cc67bf467c48edf98d5acc63e62d808324 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:24:32 np0005554845 python3[30208]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean-antelope-testing.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:24:32 np0005554845 python3[30281]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765430670.4924688-33946-167845025829764/source mode=0755 _original_basename=delorean-antelope-testing.repo follow=False checksum=4ebc56dead962b5d40b8d420dad43b948b84d3fc backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:24:32 np0005554845 python3[30307]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-highavailability.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:24:33 np0005554845 python3[30380]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765430670.4924688-33946-167845025829764/source mode=0755 _original_basename=repo-setup-centos-highavailability.repo follow=False checksum=55d0f695fd0d8f47cbc3044ce0dcf5f88862490f backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:24:33 np0005554845 python3[30406]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-powertools.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:24:34 np0005554845 python3[30479]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765430670.4924688-33946-167845025829764/source mode=0755 _original_basename=repo-setup-centos-powertools.repo follow=False checksum=4b0cf99aa89c5c5be0151545863a7a7568f67568 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:24:34 np0005554845 python3[30505]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-appstream.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:24:34 np0005554845 python3[30578]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765430670.4924688-33946-167845025829764/source mode=0755 _original_basename=repo-setup-centos-appstream.repo follow=False checksum=e89244d2503b2996429dda1857290c1e91e393a1 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:24:34 np0005554845 python3[30604]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/repo-setup-centos-baseos.repo follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:24:35 np0005554845 python3[30677]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765430670.4924688-33946-167845025829764/source mode=0755 _original_basename=repo-setup-centos-baseos.repo follow=False checksum=36d926db23a40dbfa5c84b5e4d43eac6fa2301d6 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:24:35 np0005554845 python3[30703]: ansible-ansible.legacy.stat Invoked with path=/etc/yum.repos.d/delorean.repo.md5 follow=False get_checksum=True checksum_algorithm=sha1 get_md5=False get_mime=True get_attributes=True
Dec 11 00:24:36 np0005554845 python3[30776]: ansible-ansible.legacy.copy Invoked with dest=/etc/yum.repos.d/ src=/home/zuul/.ansible/tmp/ansible-tmp-1765430670.4924688-33946-167845025829764/source mode=0755 _original_basename=delorean.repo.md5 follow=False checksum=2583a70b3ee76a9837350b0837bc004a8e52405c backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None remote_src=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:24:48 np0005554845 python3[30824]: ansible-ansible.legacy.command Invoked with _raw_params=hostname _uses_shell=False stdin_add_newline=True strip_empty_ends=True argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:29:47 np0005554845 systemd[1]: session-8.scope: Deactivated successfully.
Dec 11 00:29:47 np0005554845 systemd[1]: session-8.scope: Consumed 5.499s CPU time.
Dec 11 00:29:47 np0005554845 systemd-logind[789]: Session 8 logged out. Waiting for processes to exit.
Dec 11 00:29:47 np0005554845 systemd-logind[789]: Removed session 8.
Dec 11 00:35:45 np0005554845 systemd[1]: Starting dnf makecache...
Dec 11 00:35:45 np0005554845 dnf[30834]: Failed determining last makecache time.
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-openstack-barbican-42b4c41831408a8e323 377 kB/s |  13 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-python-glean-10df0bd91b9bc5c9fd9cc02d7 3.3 MB/s |  65 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-openstack-cinder-1c00d6490d88e436f26ef 1.7 MB/s |  32 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-python-stevedore-c4acc5639fd2329372142 6.0 MB/s | 131 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-python-cloudkitty-tests-tempest-2c80f8 1.5 MB/s |  32 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-os-refresh-config-9bfc52b5049be2d8de61  14 MB/s | 349 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-openstack-nova-6f8decf0b4f1aa2e96292b6 2.1 MB/s |  42 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-python-designate-tests-tempest-347fdbc 756 kB/s |  18 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-openstack-glance-1fd12c29b339f30fe823e 927 kB/s |  18 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-openstack-keystone-e4b40af0ae3698fbbbb 1.3 MB/s |  29 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-openstack-manila-3c01b7181572c95dac462 1.4 MB/s |  25 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-python-whitebox-neutron-tests-tempest- 5.4 MB/s | 154 kB     00:00
Dec 11 00:35:45 np0005554845 dnf[30834]: delorean-openstack-octavia-ba397f07a7331190208c 207 kB/s |  26 kB     00:00
Dec 11 00:35:46 np0005554845 dnf[30834]: delorean-openstack-watcher-c014f81a8647287f6dcc 590 kB/s |  16 kB     00:00
Dec 11 00:35:46 np0005554845 dnf[30834]: delorean-ansible-config_template-5ccaa22121a7ff 379 kB/s | 7.4 kB     00:00
Dec 11 00:35:46 np0005554845 dnf[30834]: delorean-puppet-ceph-7352068d7b8c84ded636ab3158 6.1 MB/s | 144 kB     00:00
Dec 11 00:35:46 np0005554845 dnf[30834]: delorean-openstack-swift-dc98a8463506ac520c469a 758 kB/s |  14 kB     00:00
Dec 11 00:35:46 np0005554845 dnf[30834]: delorean-python-tempestconf-8515371b7cceebd4282 2.3 MB/s |  53 kB     00:00
Dec 11 00:35:46 np0005554845 dnf[30834]: delorean-openstack-heat-ui-013accbfd179753bc3f0 3.9 MB/s |  96 kB     00:00
Dec 11 00:35:46 np0005554845 dnf[30834]: CentOS Stream 9 - BaseOS                         64 kB/s | 7.0 kB     00:00
Dec 11 00:35:46 np0005554845 dnf[30834]: CentOS Stream 9 - AppStream                      67 kB/s | 7.4 kB     00:00
Dec 11 00:35:46 np0005554845 dnf[30834]: CentOS Stream 9 - CRB                            71 kB/s | 6.9 kB     00:00
Dec 11 00:35:47 np0005554845 dnf[30834]: CentOS Stream 9 - Extras packages                27 kB/s | 8.3 kB     00:00
Dec 11 00:35:47 np0005554845 dnf[30834]: dlrn-antelope-testing                            29 MB/s | 1.1 MB     00:00
Dec 11 00:35:47 np0005554845 dnf[30834]: dlrn-antelope-build-deps                         16 MB/s | 461 kB     00:00
Dec 11 00:35:47 np0005554845 dnf[30834]: centos9-rabbitmq                                7.6 MB/s | 123 kB     00:00
Dec 11 00:35:47 np0005554845 dnf[30834]: centos9-storage                                  26 MB/s | 415 kB     00:00
Dec 11 00:35:48 np0005554845 dnf[30834]: centos9-opstools                                1.5 MB/s |  51 kB     00:00
Dec 11 00:35:48 np0005554845 dnf[30834]: NFV SIG OpenvSwitch                              21 MB/s | 456 kB     00:00
Dec 11 00:35:48 np0005554845 dnf[30834]: repo-setup-centos-appstream                      95 MB/s |  26 MB     00:00
Dec 11 00:35:54 np0005554845 dnf[30834]: repo-setup-centos-baseos                         82 MB/s | 8.8 MB     00:00
Dec 11 00:35:55 np0005554845 dnf[30834]: repo-setup-centos-highavailability               34 MB/s | 744 kB     00:00
Dec 11 00:35:56 np0005554845 dnf[30834]: repo-setup-centos-powertools                     85 MB/s | 7.4 MB     00:00
Dec 11 00:35:59 np0005554845 dnf[30834]: Extra Packages for Enterprise Linux 9 - x86_64   16 MB/s |  20 MB     00:01
Dec 11 00:36:12 np0005554845 dnf[30834]: Metadata cache created.
Dec 11 00:36:12 np0005554845 systemd[1]: dnf-makecache.service: Deactivated successfully.
Dec 11 00:36:12 np0005554845 systemd[1]: Finished dnf makecache.
Dec 11 00:36:12 np0005554845 systemd[1]: dnf-makecache.service: Consumed 24.749s CPU time.
Dec 11 00:37:11 np0005554845 systemd-logind[789]: New session 9 of user zuul.
Dec 11 00:37:11 np0005554845 systemd[1]: Started Session 9 of User zuul.
Dec 11 00:37:13 np0005554845 python3.9[31091]: ansible-ansible.legacy.setup Invoked with gather_subset=['all'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:37:15 np0005554845 python3.9[31272]: ansible-ansible.legacy.command Invoked with _raw_params=set -euxo pipefail#012pushd /var/tmp#012curl -sL https://github.com/openstack-k8s-operators/repo-setup/archive/refs/heads/main.tar.gz | tar -xz#012pushd repo-setup-main#012python3 -m venv ./venv#012PBR_VERSION=0.0.0 ./venv/bin/pip install ./#012./venv/bin/repo-setup current-podified -b antelope#012popd#012rm -rf repo-setup-main#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:37:23 np0005554845 systemd[1]: session-9.scope: Deactivated successfully.
Dec 11 00:37:23 np0005554845 systemd[1]: session-9.scope: Consumed 8.277s CPU time.
Dec 11 00:37:23 np0005554845 systemd-logind[789]: Session 9 logged out. Waiting for processes to exit.
Dec 11 00:37:23 np0005554845 systemd-logind[789]: Removed session 9.
Dec 11 00:37:28 np0005554845 systemd-logind[789]: New session 10 of user zuul.
Dec 11 00:37:28 np0005554845 systemd[1]: Started Session 10 of User zuul.
Dec 11 00:37:29 np0005554845 python3.9[31483]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:37:30 np0005554845 systemd[1]: session-10.scope: Deactivated successfully.
Dec 11 00:37:30 np0005554845 systemd-logind[789]: Session 10 logged out. Waiting for processes to exit.
Dec 11 00:37:30 np0005554845 systemd-logind[789]: Removed session 10.
Dec 11 00:37:45 np0005554845 systemd-logind[789]: New session 11 of user zuul.
Dec 11 00:37:45 np0005554845 systemd[1]: Started Session 11 of User zuul.
Dec 11 00:37:46 np0005554845 python3.9[31664]: ansible-ansible.legacy.ping Invoked with data=pong
Dec 11 00:37:47 np0005554845 python3.9[31838]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:37:48 np0005554845 python3.9[31990]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:37:49 np0005554845 python3.9[32143]: ansible-ansible.builtin.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:37:50 np0005554845 python3.9[32295]: ansible-ansible.builtin.file Invoked with mode=755 path=/etc/ansible/facts.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:37:51 np0005554845 python3.9[32447]: ansible-ansible.legacy.stat Invoked with path=/etc/ansible/facts.d/bootc.fact follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:37:52 np0005554845 python3.9[32570]: ansible-ansible.legacy.copy Invoked with dest=/etc/ansible/facts.d/bootc.fact mode=755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765431471.1457946-180-18335222871807/.source.fact _original_basename=bootc.fact follow=False checksum=eb4122ce7fc50a38407beb511c4ff8c178005b12 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:37:53 np0005554845 python3.9[32722]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:37:54 np0005554845 python3.9[32878]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/log/journal setype=var_log_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:37:55 np0005554845 python3.9[33030]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/config-data/ansible-generated recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:37:55 np0005554845 python3.9[33180]: ansible-ansible.builtin.service_facts Invoked
Dec 11 00:38:01 np0005554845 python3.9[33434]: ansible-ansible.builtin.lineinfile Invoked with line=cloud-init=disabled path=/proc/cmdline state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:38:02 np0005554845 python3.9[33584]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:38:03 np0005554845 python3.9[33738]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:38:05 np0005554845 python3.9[33896]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:38:06 np0005554845 python3.9[33980]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:38:49 np0005554845 systemd[1]: Reloading.
Dec 11 00:38:49 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:38:49 np0005554845 systemd[1]: Listening on Device-mapper event daemon FIFOs.
Dec 11 00:38:49 np0005554845 systemd[1]: Reloading.
Dec 11 00:38:49 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:38:49 np0005554845 systemd[1]: Starting Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling...
Dec 11 00:38:49 np0005554845 systemd[1]: Finished Monitoring of LVM2 mirrors, snapshots etc. using dmeventd or progress polling.
Dec 11 00:38:49 np0005554845 systemd[1]: Reloading.
Dec 11 00:38:49 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:38:49 np0005554845 systemd[1]: Listening on LVM2 poll daemon socket.
Dec 11 00:38:50 np0005554845 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 11 00:38:50 np0005554845 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 11 00:39:54 np0005554845 kernel: SELinux:  Converting 2720 SID table entries...
Dec 11 00:39:54 np0005554845 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 00:39:54 np0005554845 kernel: SELinux:  policy capability open_perms=1
Dec 11 00:39:54 np0005554845 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 00:39:54 np0005554845 kernel: SELinux:  policy capability always_check_network=0
Dec 11 00:39:54 np0005554845 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 00:39:54 np0005554845 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 00:39:54 np0005554845 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 00:39:54 np0005554845 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=8 res=1
Dec 11 00:39:54 np0005554845 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 00:39:54 np0005554845 systemd[1]: Starting man-db-cache-update.service...
Dec 11 00:39:54 np0005554845 systemd[1]: Reloading.
Dec 11 00:39:54 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:39:54 np0005554845 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 00:39:56 np0005554845 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 00:39:56 np0005554845 systemd[1]: Finished man-db-cache-update.service.
Dec 11 00:39:56 np0005554845 systemd[1]: man-db-cache-update.service: Consumed 1.343s CPU time.
Dec 11 00:39:56 np0005554845 systemd[1]: run-r4118fa361204465fa12b113582e8a78e.service: Deactivated successfully.
Dec 11 00:40:04 np0005554845 python3.9[35482]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:40:07 np0005554845 python3.9[35763]: ansible-ansible.posix.selinux Invoked with policy=targeted state=enforcing configfile=/etc/selinux/config update_kernel_param=False
Dec 11 00:40:08 np0005554845 python3.9[35915]: ansible-ansible.legacy.command Invoked with cmd=dd if=/dev/zero of=/swap count=1024 bs=1M creates=/swap _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None removes=None stdin=None
Dec 11 00:40:10 np0005554845 python3.9[36068]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/swap recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:40:15 np0005554845 python3.9[36220]: ansible-ansible.posix.mount Invoked with dump=0 fstype=swap name=none opts=sw passno=0 src=/swap state=present path=none boot=True opts_no_log=False backup=False fstab=None
Dec 11 00:40:17 np0005554845 python3.9[36373]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/ca-trust/source/anchors setype=cert_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:40:18 np0005554845 python3.9[36525]: ansible-ansible.legacy.stat Invoked with path=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:40:19 np0005554845 python3.9[36648]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/ca-trust/source/anchors/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431618.414923-668-256907520143666/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=88bc5d1df1135a8eda5bcc12255c75569f113986 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:40:23 np0005554845 python3.9[36800]: ansible-ansible.builtin.stat Invoked with path=/etc/lvm/devices/system.devices follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:40:24 np0005554845 python3.9[36952]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/vgimportdevices --all _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:40:25 np0005554845 python3.9[37105]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/lvm/devices/system.devices state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:40:26 np0005554845 python3.9[37257]: ansible-ansible.builtin.getent Invoked with database=passwd key=qemu fail_key=True service=None split=None
Dec 11 00:40:26 np0005554845 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 00:40:27 np0005554845 python3.9[37411]: ansible-ansible.builtin.group Invoked with gid=107 name=qemu state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 00:40:29 np0005554845 python3.9[37569]: ansible-ansible.builtin.user Invoked with comment=qemu user group=qemu groups=[''] name=qemu shell=/sbin/nologin state=present uid=107 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 11 00:40:30 np0005554845 python3.9[37729]: ansible-ansible.builtin.getent Invoked with database=passwd key=hugetlbfs fail_key=True service=None split=None
Dec 11 00:40:30 np0005554845 python3.9[37882]: ansible-ansible.builtin.group Invoked with gid=42477 name=hugetlbfs state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 00:40:31 np0005554845 python3.9[38040]: ansible-ansible.builtin.file Invoked with group=qemu mode=0755 owner=qemu path=/var/lib/vhost_sockets setype=virt_cache_t seuser=system_u state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None serole=None selevel=None attributes=None
Dec 11 00:40:33 np0005554845 python3.9[38192]: ansible-ansible.legacy.dnf Invoked with name=['dracut-config-generic'] state=absent allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:40:37 np0005554845 python3.9[38345]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/modules-load.d setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:40:37 np0005554845 python3.9[38497]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:40:38 np0005554845 python3.9[38620]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765431637.3591716-1025-106494024898290/.source.conf follow=False _original_basename=edpm-modprobe.conf.j2 checksum=8021efe01721d8fa8cab46b95c00ec1be6dbb9d0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:40:39 np0005554845 python3.9[38772]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:40:39 np0005554845 systemd[1]: Starting Load Kernel Modules...
Dec 11 00:40:39 np0005554845 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 11 00:40:39 np0005554845 kernel: Bridge firewalling registered
Dec 11 00:40:39 np0005554845 systemd-modules-load[38776]: Inserted module 'br_netfilter'
Dec 11 00:40:39 np0005554845 systemd[1]: Finished Load Kernel Modules.
Dec 11 00:40:40 np0005554845 python3.9[38932]: ansible-ansible.legacy.stat Invoked with path=/etc/sysctl.d/99-edpm.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:40:41 np0005554845 python3.9[39055]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysctl.d/99-edpm.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765431640.2488236-1095-59499085701590/.source.conf follow=False _original_basename=edpm-sysctl.conf.j2 checksum=2a366439721b855adcfe4d7f152babb68596a007 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:40:42 np0005554845 python3.9[39207]: ansible-ansible.legacy.dnf Invoked with name=['tuned', 'tuned-profiles-cpu-partitioning'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:40:45 np0005554845 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 11 00:40:45 np0005554845 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 11 00:40:45 np0005554845 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 00:40:45 np0005554845 systemd[1]: Starting man-db-cache-update.service...
Dec 11 00:40:45 np0005554845 systemd[1]: Reloading.
Dec 11 00:40:45 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:40:46 np0005554845 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 00:40:48 np0005554845 python3.9[41755]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/active_profile follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:40:49 np0005554845 python3.9[42616]: ansible-ansible.builtin.slurp Invoked with src=/etc/tuned/active_profile
Dec 11 00:40:50 np0005554845 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 00:40:50 np0005554845 systemd[1]: Finished man-db-cache-update.service.
Dec 11 00:40:50 np0005554845 systemd[1]: man-db-cache-update.service: Consumed 5.576s CPU time.
Dec 11 00:40:50 np0005554845 systemd[1]: run-r7a35cb1c00764b0990c0352c548308b4.service: Deactivated successfully.
Dec 11 00:40:50 np0005554845 python3.9[43228]: ansible-ansible.builtin.stat Invoked with path=/etc/tuned/throughput-performance-variables.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:40:51 np0005554845 python3.9[43380]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/tuned-adm profile throughput-performance _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:40:51 np0005554845 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 11 00:40:52 np0005554845 systemd[1]: Starting Authorization Manager...
Dec 11 00:40:52 np0005554845 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 11 00:40:52 np0005554845 polkitd[43597]: Started polkitd version 0.117
Dec 11 00:40:52 np0005554845 systemd[1]: Started Authorization Manager.
Dec 11 00:40:53 np0005554845 python3.9[43767]: ansible-ansible.builtin.systemd Invoked with enabled=True name=tuned state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:40:53 np0005554845 systemd[1]: Stopping Dynamic System Tuning Daemon...
Dec 11 00:40:53 np0005554845 systemd[1]: tuned.service: Deactivated successfully.
Dec 11 00:40:53 np0005554845 systemd[1]: Stopped Dynamic System Tuning Daemon.
Dec 11 00:40:53 np0005554845 systemd[1]: Starting Dynamic System Tuning Daemon...
Dec 11 00:40:53 np0005554845 systemd[1]: Started Dynamic System Tuning Daemon.
Dec 11 00:40:55 np0005554845 python3.9[43928]: ansible-ansible.builtin.slurp Invoked with src=/proc/cmdline
Dec 11 00:40:59 np0005554845 python3.9[44080]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksm.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:40:59 np0005554845 systemd[1]: Reloading.
Dec 11 00:40:59 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:41:00 np0005554845 python3.9[44269]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ksmtuned.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:41:00 np0005554845 systemd[1]: Reloading.
Dec 11 00:41:00 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:41:01 np0005554845 python3.9[44458]: ansible-ansible.legacy.command Invoked with _raw_params=mkswap "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:41:02 np0005554845 python3.9[44611]: ansible-ansible.legacy.command Invoked with _raw_params=swapon "/swap" _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:41:02 np0005554845 kernel: Adding 1048572k swap on /swap.  Priority:-2 extents:1 across:1048572k 
Dec 11 00:41:03 np0005554845 python3.9[44764]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/bin/update-ca-trust _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:41:05 np0005554845 python3.9[44926]: ansible-ansible.legacy.command Invoked with _raw_params=echo 2 >/sys/kernel/mm/ksm/run _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:41:06 np0005554845 python3.9[45079]: ansible-ansible.builtin.systemd Invoked with name=systemd-sysctl.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:41:06 np0005554845 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 11 00:41:06 np0005554845 systemd[1]: Stopped Apply Kernel Variables.
Dec 11 00:41:06 np0005554845 systemd[1]: Stopping Apply Kernel Variables...
Dec 11 00:41:06 np0005554845 systemd[1]: Starting Apply Kernel Variables...
Dec 11 00:41:06 np0005554845 systemd[1]: run-credentials-systemd\x2dsysctl.service.mount: Deactivated successfully.
Dec 11 00:41:06 np0005554845 systemd[1]: Finished Apply Kernel Variables.
Dec 11 00:41:07 np0005554845 systemd[1]: session-11.scope: Deactivated successfully.
Dec 11 00:41:07 np0005554845 systemd[1]: session-11.scope: Consumed 2min 18.579s CPU time.
Dec 11 00:41:07 np0005554845 systemd-logind[789]: Session 11 logged out. Waiting for processes to exit.
Dec 11 00:41:07 np0005554845 systemd-logind[789]: Removed session 11.
Dec 11 00:41:12 np0005554845 systemd-logind[789]: New session 12 of user zuul.
Dec 11 00:41:12 np0005554845 systemd[1]: Started Session 12 of User zuul.
Dec 11 00:41:13 np0005554845 python3.9[45262]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:41:15 np0005554845 python3.9[45416]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:41:16 np0005554845 python3.9[45572]: ansible-ansible.legacy.command Invoked with _raw_params=PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin which growvols#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:41:19 np0005554845 python3.9[45723]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:41:20 np0005554845 python3.9[45879]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:41:21 np0005554845 python3.9[45963]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:41:24 np0005554845 python3.9[46116]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:41:25 np0005554845 python3.9[46287]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:41:26 np0005554845 python3.9[46439]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:41:26 np0005554845 podman[46440]: 2025-12-11 05:41:26.427676712 +0000 UTC m=+0.050907964 system refresh
Dec 11 00:41:27 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:41:27 np0005554845 python3.9[46602]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:41:28 np0005554845 python3.9[46725]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/networks/podman.json group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431686.8426473-289-259853216805410/.source.json follow=False _original_basename=podman_network_config.j2 checksum=7bfbdcae7769a0002a8b8b47fdf6129b5b23cccf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:41:29 np0005554845 python3.9[46877]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:41:29 np0005554845 python3.9[47000]: ansible-ansible.legacy.copy Invoked with dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf group=root mode=0644 owner=root setype=etc_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765431688.486536-334-108531692442780/.source.conf follow=False _original_basename=registries.conf.j2 checksum=439b91f35f15c57e9eb8dc822211d5da71a6cca8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:41:30 np0005554845 python3.9[47152]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:41:31 np0005554845 python3.9[47304]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:41:31 np0005554845 python3.9[47456]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:41:32 np0005554845 python3.9[47608]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:41:33 np0005554845 python3.9[47758]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:41:34 np0005554845 python3.9[47912]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 00:41:36 np0005554845 python3.9[48065]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openstack-network-scripts'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 00:41:40 np0005554845 python3.9[48225]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['podman', 'buildah'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 00:41:42 np0005554845 python3.9[48378]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['tuned', 'tuned-profiles-cpu-partitioning'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 00:41:44 np0005554845 python3.9[48531]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['NetworkManager-ovs'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 00:41:47 np0005554845 python3.9[48687]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['os-net-config'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 00:41:52 np0005554845 python3.9[48857]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openssh-server'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 00:41:54 np0005554845 python3.9[49010]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 00:42:09 np0005554845 python3.9[49347]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['iscsi-initiator-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 00:42:11 np0005554845 python3.9[49503]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:42:12 np0005554845 python3.9[49678]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:42:12 np0005554845 python3.9[49801]: ansible-ansible.legacy.copy Invoked with dest=/root/.config/containers/auth.json group=zuul mode=0660 owner=zuul src=/home/zuul/.ansible/tmp/ansible-tmp-1765431731.87258-778-58680578070381/.source.json _original_basename=.gxaryxip follow=False checksum=bf21a9e8fbc5a3846fb05b4fa0859e0917b2202f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:42:13 np0005554845 python3.9[49953]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 00:42:14 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:16 np0005554845 systemd[1]: var-lib-containers-storage-overlay-compat2039428102-lower\x2dmapped.mount: Deactivated successfully.
Dec 11 00:42:19 np0005554845 podman[49965]: 2025-12-11 05:42:19.618775113 +0000 UTC m=+5.576007479 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 11 00:42:19 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:19 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:19 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:24 np0005554845 python3.9[50261]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 00:42:24 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:36 np0005554845 podman[50274]: 2025-12-11 05:42:36.295822866 +0000 UTC m=+11.580616370 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 00:42:36 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:36 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:36 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:38 np0005554845 python3.9[50575]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 00:42:38 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:40 np0005554845 podman[50586]: 2025-12-11 05:42:40.134675576 +0000 UTC m=+1.112823463 image pull bcd3898ac099c7fff3d2ff3fc32de931119ed36068f8a2617bd8fa95e51d1b81 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 11 00:42:40 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:40 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:40 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:41 np0005554845 python3.9[50822]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 00:42:41 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:54 np0005554845 podman[50836]: 2025-12-11 05:42:54.488146146 +0000 UTC m=+13.016238461 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 11 00:42:54 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:54 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:42:54 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:43:03 np0005554845 python3.9[51098]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 00:43:03 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:43:07 np0005554845 podman[51110]: 2025-12-11 05:43:07.213131646 +0000 UTC m=+3.647153326 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 11 00:43:07 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:43:07 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:43:07 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:43:08 np0005554845 python3.9[51367]: ansible-containers.podman.podman_image Invoked with auth_file=/root/.config/containers/auth.json name=quay.io/prometheus/node-exporter:v1.5.0 tag=latest pull=True push=False force=False state=present executable=podman build={'force_rm': False, 'format': 'oci', 'cache': True, 'rm': True, 'annotation': None, 'file': None, 'container_file': None, 'volume': None, 'extra_args': None, 'target': None} push_args={'ssh': None, 'compress': None, 'format': None, 'remove_signatures': None, 'sign_by': None, 'dest': None, 'extra_args': None, 'transport': None} arch=None pull_extra_args=None path=None validate_certs=None username=None password=NOT_LOGGING_PARAMETER ca_cert_dir=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None
Dec 11 00:43:09 np0005554845 podman[51380]: 2025-12-11 05:43:09.663283539 +0000 UTC m=+1.493215265 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 11 00:43:09 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:43:09 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:43:09 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:43:12 np0005554845 systemd-logind[789]: Session 12 logged out. Waiting for processes to exit.
Dec 11 00:43:12 np0005554845 systemd[1]: session-12.scope: Deactivated successfully.
Dec 11 00:43:12 np0005554845 systemd[1]: session-12.scope: Consumed 1min 51.520s CPU time.
Dec 11 00:43:12 np0005554845 systemd-logind[789]: Removed session 12.
Dec 11 00:43:18 np0005554845 systemd-logind[789]: New session 13 of user zuul.
Dec 11 00:43:18 np0005554845 systemd[1]: Started Session 13 of User zuul.
Dec 11 00:43:19 np0005554845 python3.9[51682]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:43:20 np0005554845 python3.9[51838]: ansible-ansible.builtin.getent Invoked with database=passwd key=openvswitch fail_key=True service=None split=None
Dec 11 00:43:21 np0005554845 python3.9[51991]: ansible-ansible.builtin.group Invoked with gid=42476 name=openvswitch state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 00:43:22 np0005554845 python3.9[52149]: ansible-ansible.builtin.user Invoked with comment=openvswitch user group=openvswitch groups=['hugetlbfs'] name=openvswitch shell=/sbin/nologin state=present uid=42476 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 11 00:43:25 np0005554845 python3.9[52309]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:43:26 np0005554845 python3.9[52393]: ansible-ansible.legacy.dnf Invoked with download_only=True name=['openvswitch'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 00:43:28 np0005554845 python3.9[52555]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:43:41 np0005554845 kernel: SELinux:  Converting 2733 SID table entries...
Dec 11 00:43:41 np0005554845 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 00:43:41 np0005554845 kernel: SELinux:  policy capability open_perms=1
Dec 11 00:43:41 np0005554845 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 00:43:41 np0005554845 kernel: SELinux:  policy capability always_check_network=0
Dec 11 00:43:41 np0005554845 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 00:43:41 np0005554845 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 00:43:41 np0005554845 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 00:43:41 np0005554845 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=9 res=1
Dec 11 00:43:41 np0005554845 systemd[1]: Started daily update of the root trust anchor for DNSSEC.
Dec 11 00:43:43 np0005554845 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 00:43:43 np0005554845 systemd[1]: Starting man-db-cache-update.service...
Dec 11 00:43:43 np0005554845 systemd[1]: Reloading.
Dec 11 00:43:43 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:43:43 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:43:43 np0005554845 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 00:43:43 np0005554845 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 00:43:43 np0005554845 systemd[1]: Finished man-db-cache-update.service.
Dec 11 00:43:43 np0005554845 systemd[1]: run-r69f2e9ada62a447ab22e22bb800acc88.service: Deactivated successfully.
Dec 11 00:43:48 np0005554845 python3.9[53652]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 00:43:48 np0005554845 systemd[1]: Reloading.
Dec 11 00:43:48 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:43:48 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:43:49 np0005554845 systemd[1]: Starting Open vSwitch Database Unit...
Dec 11 00:43:49 np0005554845 chown[53695]: /usr/bin/chown: cannot access '/run/openvswitch': No such file or directory
Dec 11 00:43:49 np0005554845 ovs-ctl[53700]: /etc/openvswitch/conf.db does not exist ... (warning).
Dec 11 00:43:49 np0005554845 ovs-ctl[53700]: Creating empty database /etc/openvswitch/conf.db [  OK  ]
Dec 11 00:43:49 np0005554845 ovs-ctl[53700]: Starting ovsdb-server [  OK  ]
Dec 11 00:43:49 np0005554845 ovs-vsctl[53749]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait -- init -- set Open_vSwitch . db-version=8.5.1
Dec 11 00:43:49 np0005554845 ovs-vsctl[53769]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait set Open_vSwitch . ovs-version=3.3.5-115.el9s "external-ids:system-id=\"3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459\"" "external-ids:rundir=\"/var/run/openvswitch\"" "system-type=\"centos\"" "system-version=\"9\""
Dec 11 00:43:49 np0005554845 ovs-ctl[53700]: Configuring Open vSwitch system IDs [  OK  ]
Dec 11 00:43:49 np0005554845 ovs-vsctl[53775]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Dec 11 00:43:49 np0005554845 ovs-ctl[53700]: Enabling remote OVSDB managers [  OK  ]
Dec 11 00:43:49 np0005554845 systemd[1]: Started Open vSwitch Database Unit.
Dec 11 00:43:49 np0005554845 systemd[1]: Starting Open vSwitch Delete Transient Ports...
Dec 11 00:43:49 np0005554845 systemd[1]: Finished Open vSwitch Delete Transient Ports.
Dec 11 00:43:49 np0005554845 systemd[1]: Starting Open vSwitch Forwarding Unit...
Dec 11 00:43:49 np0005554845 kernel: openvswitch: Open vSwitch switching datapath
Dec 11 00:43:49 np0005554845 ovs-ctl[53820]: Inserting openvswitch module [  OK  ]
Dec 11 00:43:49 np0005554845 ovs-ctl[53789]: Starting ovs-vswitchd [  OK  ]
Dec 11 00:43:49 np0005554845 ovs-vsctl[53837]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --no-wait add Open_vSwitch . external-ids hostname=compute-2
Dec 11 00:43:49 np0005554845 ovs-ctl[53789]: Enabling remote OVSDB managers [  OK  ]
Dec 11 00:43:49 np0005554845 systemd[1]: Started Open vSwitch Forwarding Unit.
Dec 11 00:43:49 np0005554845 systemd[1]: Starting Open vSwitch...
Dec 11 00:43:49 np0005554845 systemd[1]: Finished Open vSwitch.
Dec 11 00:43:50 np0005554845 python3.9[53989]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:43:52 np0005554845 python3.9[54141]: ansible-community.general.sefcontext Invoked with selevel=s0 setype=container_file_t state=present target=/var/lib/edpm-config(/.*)? ignore_selinux_state=False ftype=a reload=True substitute=None seuser=None
Dec 11 00:43:53 np0005554845 kernel: SELinux:  Converting 2747 SID table entries...
Dec 11 00:43:53 np0005554845 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 00:43:53 np0005554845 kernel: SELinux:  policy capability open_perms=1
Dec 11 00:43:53 np0005554845 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 00:43:53 np0005554845 kernel: SELinux:  policy capability always_check_network=0
Dec 11 00:43:53 np0005554845 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 00:43:53 np0005554845 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 00:43:53 np0005554845 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 00:43:54 np0005554845 python3.9[54296]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local', 'distribution'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:43:55 np0005554845 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=10 res=1
Dec 11 00:43:56 np0005554845 python3.9[54454]: ansible-ansible.legacy.dnf Invoked with name=['driverctl', 'lvm2', 'crudini', 'jq', 'nftables', 'NetworkManager', 'openstack-selinux', 'python3-libselinux', 'python3-pyyaml', 'rsync', 'tmpwatch', 'sysstat', 'iproute-tc', 'ksmtuned', 'systemd-container', 'crypto-policies-scripts', 'grubby', 'sos'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:43:58 np0005554845 python3.9[54607]: ansible-ansible.legacy.command Invoked with _raw_params=rpm -V driverctl lvm2 crudini jq nftables NetworkManager openstack-selinux python3-libselinux python3-pyyaml rsync tmpwatch sysstat iproute-tc ksmtuned systemd-container crypto-policies-scripts grubby sos _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:43:59 np0005554845 python3.9[54894]: ansible-ansible.builtin.file Invoked with mode=0750 path=/var/lib/edpm-config selevel=s0 setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 11 00:44:00 np0005554845 python3.9[55044]: ansible-ansible.builtin.stat Invoked with path=/etc/cloud/cloud.cfg.d follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:44:01 np0005554845 python3.9[55198]: ansible-ansible.legacy.dnf Invoked with name=['NetworkManager-ovs'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:44:03 np0005554845 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 00:44:03 np0005554845 systemd[1]: Starting man-db-cache-update.service...
Dec 11 00:44:03 np0005554845 systemd[1]: Reloading.
Dec 11 00:44:03 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:44:03 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:44:03 np0005554845 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 00:44:04 np0005554845 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 00:44:04 np0005554845 systemd[1]: Finished man-db-cache-update.service.
Dec 11 00:44:04 np0005554845 systemd[1]: run-r8b96e322266e4546b6bbd290d6ea43c4.service: Deactivated successfully.
Dec 11 00:44:05 np0005554845 python3.9[55516]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:44:05 np0005554845 systemd[1]: NetworkManager-wait-online.service: Deactivated successfully.
Dec 11 00:44:05 np0005554845 systemd[1]: Stopped Network Manager Wait Online.
Dec 11 00:44:05 np0005554845 systemd[1]: Stopping Network Manager Wait Online...
Dec 11 00:44:05 np0005554845 systemd[1]: Stopping Network Manager...
Dec 11 00:44:05 np0005554845 NetworkManager[7186]: <info>  [1765431845.2864] caught SIGTERM, shutting down normally.
Dec 11 00:44:05 np0005554845 NetworkManager[7186]: <info>  [1765431845.2889] dhcp4 (eth0): canceled DHCP transaction
Dec 11 00:44:05 np0005554845 NetworkManager[7186]: <info>  [1765431845.2889] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 00:44:05 np0005554845 NetworkManager[7186]: <info>  [1765431845.2889] dhcp4 (eth0): state changed no lease
Dec 11 00:44:05 np0005554845 NetworkManager[7186]: <info>  [1765431845.2894] manager: NetworkManager state is now CONNECTED_SITE
Dec 11 00:44:05 np0005554845 NetworkManager[7186]: <info>  [1765431845.2968] exiting (success)
Dec 11 00:44:05 np0005554845 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 00:44:05 np0005554845 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 00:44:05 np0005554845 systemd[1]: NetworkManager.service: Deactivated successfully.
Dec 11 00:44:05 np0005554845 systemd[1]: Stopped Network Manager.
Dec 11 00:44:05 np0005554845 systemd[1]: NetworkManager.service: Consumed 13.586s CPU time, 4.1M memory peak, read 0B from disk, written 34.0K to disk.
Dec 11 00:44:05 np0005554845 systemd[1]: Starting Network Manager...
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.4129] NetworkManager (version 1.54.2-1.el9) is starting... (after a restart, boot:6fd4de4e-7b3d-47fa-92ea-8323052e6a02)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.4130] Read config: /etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.4214] manager[0x55f61e627000]: monitoring kernel firmware directory '/lib/firmware'.
Dec 11 00:44:05 np0005554845 systemd[1]: Starting Hostname Service...
Dec 11 00:44:05 np0005554845 systemd[1]: Started Hostname Service.
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5254] hostname: hostname: using hostnamed
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5255] hostname: static hostname changed from (none) to "compute-2"
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5262] dns-mgr: init: dns=default,systemd-resolved rc-manager=symlink (auto)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5267] manager[0x55f61e627000]: rfkill: Wi-Fi hardware radio set enabled
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5267] manager[0x55f61e627000]: rfkill: WWAN hardware radio set enabled
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5290] Loaded device plugin: NMOvsFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-ovs.so)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5300] Loaded device plugin: NMTeamFactory (/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-device-plugin-team.so)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5301] manager: rfkill: Wi-Fi enabled by radio killswitch; enabled by state file
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5301] manager: rfkill: WWAN enabled by radio killswitch; enabled by state file
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5301] manager: Networking is enabled by state file
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5304] settings: Loaded settings plugin: keyfile (internal)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5308] settings: Loaded settings plugin: ifcfg-rh ("/usr/lib64/NetworkManager/1.54.2-1.el9/libnm-settings-plugin-ifcfg-rh.so")
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5335] Warning: the ifcfg-rh plugin is deprecated, please migrate connections to the keyfile format using "nmcli connection migrate"
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5346] dhcp: init: Using DHCP client 'internal'
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5349] manager: (lo): new Loopback device (/org/freedesktop/NetworkManager/Devices/1)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5354] device (lo): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5359] device (lo): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'external')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5366] device (lo): Activation: starting connection 'lo' (98463273-a93b-4216-b1e6-11bf6b9079be)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5373] device (eth0): carrier: link connected
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5378] manager: (eth0): new Ethernet device (/org/freedesktop/NetworkManager/Devices/2)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5382] manager: (eth0): assume: will attempt to assume matching connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03) (indicated)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5382] device (eth0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5388] device (eth0): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5394] device (eth0): Activation: starting connection 'System eth0' (5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5399] device (eth1): carrier: link connected
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5402] manager: (eth1): new Ethernet device (/org/freedesktop/NetworkManager/Devices/3)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5408] manager: (eth1): assume: will attempt to assume matching connection 'ci-private-network' (b09f1a0b-2ca1-54a3-83c7-38f8c4ed5856) (indicated)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5409] device (eth1): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5413] device (eth1): state change: unavailable -> disconnected (reason 'connection-assumed', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5420] device (eth1): Activation: starting connection 'ci-private-network' (b09f1a0b-2ca1-54a3-83c7-38f8c4ed5856)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5425] bus-manager: acquired D-Bus service "org.freedesktop.NetworkManager"
Dec 11 00:44:05 np0005554845 systemd[1]: Started Network Manager.
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5434] device (lo): state change: disconnected -> prepare (reason 'none', managed-type: 'external')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5437] device (lo): state change: prepare -> config (reason 'none', managed-type: 'external')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5439] device (lo): state change: config -> ip-config (reason 'none', managed-type: 'external')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5441] device (eth0): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5445] device (eth0): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5447] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5449] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5452] device (lo): state change: ip-config -> ip-check (reason 'none', managed-type: 'external')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5471] device (eth0): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5473] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5487] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5502] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5514] device (lo): state change: ip-check -> secondaries (reason 'none', managed-type: 'external')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5516] device (lo): state change: secondaries -> activated (reason 'none', managed-type: 'external')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5520] device (lo): Activation: successful, device activated.
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5546] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5547] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5549] manager: NetworkManager state is now CONNECTED_LOCAL
Dec 11 00:44:05 np0005554845 NetworkManager[55529]: <info>  [1765431845.5551] device (eth1): Activation: successful, device activated.
Dec 11 00:44:05 np0005554845 systemd[1]: Starting Network Manager Wait Online...
Dec 11 00:44:06 np0005554845 python3.9[55723]: ansible-ansible.legacy.dnf Invoked with name=['os-net-config'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:44:07 np0005554845 NetworkManager[55529]: <info>  [1765431847.1915] dhcp4 (eth0): state changed new lease, address=38.102.83.9
Dec 11 00:44:07 np0005554845 NetworkManager[55529]: <info>  [1765431847.1927] policy: set 'System eth0' (eth0) as default for IPv4 routing and DNS
Dec 11 00:44:07 np0005554845 NetworkManager[55529]: <info>  [1765431847.1996] device (eth0): state change: ip-config -> ip-check (reason 'none', managed-type: 'assume')
Dec 11 00:44:07 np0005554845 NetworkManager[55529]: <info>  [1765431847.2034] device (eth0): state change: ip-check -> secondaries (reason 'none', managed-type: 'assume')
Dec 11 00:44:07 np0005554845 NetworkManager[55529]: <info>  [1765431847.2036] device (eth0): state change: secondaries -> activated (reason 'none', managed-type: 'assume')
Dec 11 00:44:07 np0005554845 NetworkManager[55529]: <info>  [1765431847.2039] manager: NetworkManager state is now CONNECTED_SITE
Dec 11 00:44:07 np0005554845 NetworkManager[55529]: <info>  [1765431847.2041] device (eth0): Activation: successful, device activated.
Dec 11 00:44:07 np0005554845 NetworkManager[55529]: <info>  [1765431847.2046] manager: NetworkManager state is now CONNECTED_GLOBAL
Dec 11 00:44:07 np0005554845 NetworkManager[55529]: <info>  [1765431847.2048] manager: startup complete
Dec 11 00:44:07 np0005554845 systemd[1]: Finished Network Manager Wait Online.
Dec 11 00:44:10 np0005554845 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 00:44:10 np0005554845 systemd[1]: Starting man-db-cache-update.service...
Dec 11 00:44:10 np0005554845 systemd[1]: Reloading.
Dec 11 00:44:10 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:44:10 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:44:10 np0005554845 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 00:44:11 np0005554845 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 00:44:11 np0005554845 systemd[1]: Finished man-db-cache-update.service.
Dec 11 00:44:11 np0005554845 systemd[1]: run-r944dc29758e1479f9be0f216f7bceacf.service: Deactivated successfully.
Dec 11 00:44:14 np0005554845 python3.9[56200]: ansible-ansible.builtin.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:44:15 np0005554845 python3.9[56352]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=no-auto-default path=/etc/NetworkManager/NetworkManager.conf section=main state=present value=* exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:44:16 np0005554845 python3.9[56506]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:44:17 np0005554845 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 00:44:17 np0005554845 python3.9[56658]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=dns path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=none exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:44:18 np0005554845 python3.9[56810]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/NetworkManager.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:44:18 np0005554845 python3.9[56962]: ansible-community.general.ini_file Invoked with backup=True mode=0644 no_extra_spaces=True option=rc-manager path=/etc/NetworkManager/conf.d/99-cloud-init.conf section=main state=absent value=unmanaged exclusive=True ignore_spaces=False allow_no_value=False modify_inactive_option=True create=True follow=False unsafe_writes=False section_has_values=None values=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:44:19 np0005554845 python3.9[57114]: ansible-ansible.legacy.stat Invoked with path=/etc/dhcp/dhclient-enter-hooks follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:44:20 np0005554845 python3.9[57237]: ansible-ansible.legacy.copy Invoked with dest=/etc/dhcp/dhclient-enter-hooks mode=0755 src=/home/zuul/.ansible/tmp/ansible-tmp-1765431859.2281623-649-250934411917997/.source _original_basename=.uet8l_pq follow=False checksum=f6278a40de79a9841f6ed1fc584538225566990c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:44:21 np0005554845 python3.9[57389]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/os-net-config state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:44:22 np0005554845 python3.9[57541]: ansible-edpm_os_net_config_mappings Invoked with net_config_data_lookup={}
Dec 11 00:44:22 np0005554845 python3.9[57693]: ansible-ansible.builtin.file Invoked with path=/var/lib/edpm-config/scripts state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:44:25 np0005554845 python3.9[58120]: ansible-ansible.builtin.slurp Invoked with path=/etc/os-net-config/config.yaml src=/etc/os-net-config/config.yaml
Dec 11 00:44:26 np0005554845 ansible-async_wrapper.py[58295]: Invoked with j371523542074 300 /home/zuul/.ansible/tmp/ansible-tmp-1765431865.7859433-847-15400035031840/AnsiballZ_edpm_os_net_config.py _
Dec 11 00:44:26 np0005554845 ansible-async_wrapper.py[58298]: Starting module and watcher
Dec 11 00:44:26 np0005554845 ansible-async_wrapper.py[58298]: Start watching 58299 (300)
Dec 11 00:44:26 np0005554845 ansible-async_wrapper.py[58299]: Start module (58299)
Dec 11 00:44:26 np0005554845 ansible-async_wrapper.py[58295]: Return async_wrapper task started.
Dec 11 00:44:26 np0005554845 python3.9[58300]: ansible-edpm_os_net_config Invoked with cleanup=True config_file=/etc/os-net-config/config.yaml debug=True detailed_exit_codes=True safe_defaults=False use_nmstate=True
Dec 11 00:44:27 np0005554845 kernel: cfg80211: Loading compiled-in X.509 certificates for regulatory database
Dec 11 00:44:27 np0005554845 kernel: Loaded X.509 cert 'sforshee: 00b28ddf47aef9cea7'
Dec 11 00:44:27 np0005554845 kernel: Loaded X.509 cert 'wens: 61c038651aabdcf94bd0ac7ff06c7248db18c600'
Dec 11 00:44:27 np0005554845 kernel: platform regulatory.0: Direct firmware load for regulatory.db failed with error -2
Dec 11 00:44:27 np0005554845 kernel: cfg80211: failed to load regulatory.db
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.4674] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.4695] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5189] manager: (br-ex): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/4)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5190] audit: op="connection-add" uuid="1d1999d6-7243-40ac-8cc0-ade003a093a4" name="br-ex-br" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5203] manager: (br-ex): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/5)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5204] audit: op="connection-add" uuid="d1de3069-1d70-4dae-9ef4-7c242a9f63b2" name="br-ex-port" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5214] manager: (eth1): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/6)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5215] audit: op="connection-add" uuid="839bc8c2-780e-49aa-ba75-e28ddd390e7e" name="eth1-port" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5225] manager: (vlan20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/7)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5226] audit: op="connection-add" uuid="237f25f3-6a1c-4a37-b99d-8ede9f38dfd8" name="vlan20-port" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5237] manager: (vlan21): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/8)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5238] audit: op="connection-add" uuid="ed05a355-1a77-428d-89de-2a24bde46f37" name="vlan21-port" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5248] manager: (vlan22): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/9)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5250] audit: op="connection-add" uuid="d6e5b31a-5bb8-4b8f-8a48-060d08190e49" name="vlan22-port" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5267] audit: op="connection-update" uuid="5fb06bd0-0bb0-7ffb-45f1-d6edd65f3e03" name="System eth0" args="ipv6.addr-gen-mode,ipv6.dhcp-timeout,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority,connection.timestamp" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5282] manager: (br-ex): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/10)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5283] audit: op="connection-add" uuid="fdea7040-e025-4789-b322-29bc528dd774" name="br-ex-if" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5335] audit: op="connection-update" uuid="b09f1a0b-2ca1-54a3-83c7-38f8c4ed5856" name="ci-private-network" args="ipv6.routing-rules,ipv6.dns,ipv6.addr-gen-mode,ipv6.addresses,ipv6.method,ipv6.routes,ipv4.routing-rules,ipv4.dns,ipv4.never-default,ipv4.addresses,ipv4.method,ipv4.routes,ovs-external-ids.data,connection.master,connection.controller,connection.timestamp,connection.slave-type,connection.port-type,ovs-interface.type" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5350] manager: (vlan20): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/11)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5351] audit: op="connection-add" uuid="db84cb17-5e98-4328-a559-da6bd9d55f0a" name="vlan20-if" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5364] manager: (vlan21): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/12)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5365] audit: op="connection-add" uuid="bdf2c9f0-6a6a-415a-a98d-2aca712033eb" name="vlan21-if" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5378] manager: (vlan22): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/13)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5379] audit: op="connection-add" uuid="3968ea06-968a-4b7c-9716-cbafe3e3f845" name="vlan22-if" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5388] audit: op="connection-delete" uuid="d75b84f7-0cdf-34c5-b61b-91cf7fbf3605" name="Wired connection 1" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5398] device (br-ex)[Open vSwitch Bridge]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <warn>  [1765431868.5401] device (br-ex)[Open vSwitch Bridge]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5406] device (br-ex)[Open vSwitch Bridge]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5409] device (br-ex)[Open vSwitch Bridge]: Activation: starting connection 'br-ex-br' (1d1999d6-7243-40ac-8cc0-ade003a093a4)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5409] audit: op="connection-activate" uuid="1d1999d6-7243-40ac-8cc0-ade003a093a4" name="br-ex-br" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5411] device (br-ex)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <warn>  [1765431868.5412] device (br-ex)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5416] device (br-ex)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5419] device (br-ex)[Open vSwitch Port]: Activation: starting connection 'br-ex-port' (d1de3069-1d70-4dae-9ef4-7c242a9f63b2)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5421] device (eth1)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <warn>  [1765431868.5421] device (eth1)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5425] device (eth1)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5428] device (eth1)[Open vSwitch Port]: Activation: starting connection 'eth1-port' (839bc8c2-780e-49aa-ba75-e28ddd390e7e)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5429] device (vlan20)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <warn>  [1765431868.5430] device (vlan20)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5434] device (vlan20)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5438] device (vlan20)[Open vSwitch Port]: Activation: starting connection 'vlan20-port' (237f25f3-6a1c-4a37-b99d-8ede9f38dfd8)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5439] device (vlan21)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <warn>  [1765431868.5440] device (vlan21)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5443] device (vlan21)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5446] device (vlan21)[Open vSwitch Port]: Activation: starting connection 'vlan21-port' (ed05a355-1a77-428d-89de-2a24bde46f37)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5447] device (vlan22)[Open vSwitch Port]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <warn>  [1765431868.5448] device (vlan22)[Open vSwitch Port]: error setting IPv4 forwarding to '1': Resource temporarily unavailable
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5452] device (vlan22)[Open vSwitch Port]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5456] device (vlan22)[Open vSwitch Port]: Activation: starting connection 'vlan22-port' (d6e5b31a-5bb8-4b8f-8a48-060d08190e49)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5457] device (br-ex)[Open vSwitch Bridge]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5459] device (br-ex)[Open vSwitch Bridge]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5462] device (br-ex)[Open vSwitch Bridge]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5468] device (br-ex)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <warn>  [1765431868.5469] device (br-ex)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5473] device (br-ex)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5476] device (br-ex)[Open vSwitch Interface]: Activation: starting connection 'br-ex-if' (fdea7040-e025-4789-b322-29bc528dd774)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5477] device (br-ex)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5479] device (br-ex)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5481] device (br-ex)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5482] device (br-ex)[Open vSwitch Port]: Activation: connection 'br-ex-port' attached as port, continuing activation
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5483] device (eth1): state change: activated -> deactivating (reason 'new-activation', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5490] device (eth1): disconnecting for new activation request.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5491] device (eth1)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5494] device (eth1)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5495] device (eth1)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5496] device (eth1)[Open vSwitch Port]: Activation: connection 'eth1-port' attached as port, continuing activation
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5499] device (vlan20)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <warn>  [1765431868.5499] device (vlan20)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5502] device (vlan20)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5505] device (vlan20)[Open vSwitch Interface]: Activation: starting connection 'vlan20-if' (db84cb17-5e98-4328-a559-da6bd9d55f0a)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5506] device (vlan20)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5508] device (vlan20)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5510] device (vlan20)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5511] device (vlan20)[Open vSwitch Port]: Activation: connection 'vlan20-port' attached as port, continuing activation
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5513] device (vlan21)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <warn>  [1765431868.5514] device (vlan21)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5517] device (vlan21)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5521] device (vlan21)[Open vSwitch Interface]: Activation: starting connection 'vlan21-if' (bdf2c9f0-6a6a-415a-a98d-2aca712033eb)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5522] device (vlan21)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5525] device (vlan21)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5527] device (vlan21)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5528] device (vlan21)[Open vSwitch Port]: Activation: connection 'vlan21-port' attached as port, continuing activation
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5530] device (vlan22)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <warn>  [1765431868.5531] device (vlan22)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5535] device (vlan22)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'user-requested', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5538] device (vlan22)[Open vSwitch Interface]: Activation: starting connection 'vlan22-if' (3968ea06-968a-4b7c-9716-cbafe3e3f845)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5539] device (vlan22)[Open vSwitch Port]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5541] device (vlan22)[Open vSwitch Port]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5542] device (vlan22)[Open vSwitch Port]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5543] device (vlan22)[Open vSwitch Port]: Activation: connection 'vlan22-port' attached as port, continuing activation
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5545] device (br-ex)[Open vSwitch Bridge]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5556] audit: op="device-reapply" interface="eth0" ifindex=2 args="ipv6.addr-gen-mode,ipv6.method,ipv4.dhcp-client-id,ipv4.dhcp-timeout,802-3-ethernet.mtu,connection.autoconnect-priority" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5557] device (br-ex)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5559] device (br-ex)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5560] device (br-ex)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5565] device (br-ex)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5567] device (eth1)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5570] device (vlan20)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 kernel: ovs-system: entered promiscuous mode
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5588] device (vlan20)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5590] device (vlan20)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5595] device (vlan20)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5598] device (vlan21)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5601] device (vlan21)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5602] device (vlan21)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5607] device (vlan21)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 systemd-udevd[58305]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 00:44:28 np0005554845 kernel: Timeout policy base is empty
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5610] device (vlan22)[Open vSwitch Interface]: state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5613] device (vlan22)[Open vSwitch Interface]: state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5615] device (vlan22)[Open vSwitch Interface]: state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5619] device (vlan22)[Open vSwitch Port]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5623] dhcp4 (eth0): canceled DHCP transaction
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5623] dhcp4 (eth0): activation: beginning transaction (timeout in 45 seconds)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5623] dhcp4 (eth0): state changed no lease
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5625] dhcp4 (eth0): activation: beginning transaction (no timeout)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5634] device (br-ex)[Open vSwitch Interface]: Activation: connection 'br-ex-if' attached as port, continuing activation
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5638] audit: op="device-reapply" interface="eth1" ifindex=3 pid=58301 uid=0 result="fail" reason="Device is not activated"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5643] device (vlan20)[Open vSwitch Interface]: Activation: connection 'vlan20-if' attached as port, continuing activation
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5674] device (vlan21)[Open vSwitch Interface]: Activation: connection 'vlan21-if' attached as port, continuing activation
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5679] device (vlan22)[Open vSwitch Interface]: Activation: connection 'vlan22-if' attached as port, continuing activation
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5681] dhcp4 (eth0): state changed new lease, address=38.102.83.9
Dec 11 00:44:28 np0005554845 systemd[1]: Starting Network Manager Script Dispatcher Service...
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5734] device (eth1): disconnecting for new activation request.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5735] audit: op="connection-activate" uuid="b09f1a0b-2ca1-54a3-83c7-38f8c4ed5856" name="ci-private-network" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5763] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58301 uid=0 result="success"
Dec 11 00:44:28 np0005554845 systemd[1]: Started Network Manager Script Dispatcher Service.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5823] device (eth1): state change: deactivating -> disconnected (reason 'new-activation', managed-type: 'full')
Dec 11 00:44:28 np0005554845 kernel: br-ex: entered promiscuous mode
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5949] device (eth1): Activation: starting connection 'ci-private-network' (b09f1a0b-2ca1-54a3-83c7-38f8c4ed5856)
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5954] device (br-ex)[Open vSwitch Bridge]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5961] device (eth1): state change: disconnected -> prepare (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5964] device (eth1): state change: prepare -> config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5968] device (br-ex)[Open vSwitch Bridge]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5970] device (br-ex)[Open vSwitch Bridge]: Activation: successful, device activated.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5978] device (br-ex)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5979] device (eth1)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5980] device (vlan20)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5981] device (vlan21)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5982] device (vlan22)[Open vSwitch Port]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5992] device (eth1): state change: config -> ip-config (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 kernel: vlan22: entered promiscuous mode
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.5998] device (br-ex)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6001] device (br-ex)[Open vSwitch Port]: Activation: successful, device activated.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6004] device (eth1)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 systemd-udevd[58306]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6007] device (eth1)[Open vSwitch Port]: Activation: successful, device activated.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6009] device (vlan20)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6012] device (vlan20)[Open vSwitch Port]: Activation: successful, device activated.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6015] device (vlan21)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6017] device (vlan21)[Open vSwitch Port]: Activation: successful, device activated.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6019] device (vlan22)[Open vSwitch Port]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6021] device (vlan22)[Open vSwitch Port]: Activation: successful, device activated.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6030] device (eth1): Activation: connection 'ci-private-network' attached as port, continuing activation
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6036] device (eth1): state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6042] device (br-ex)[Open vSwitch Interface]: carrier: link connected
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6048] device (eth1): state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6054] device (eth1): state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6059] device (eth1): Activation: successful, device activated.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6065] device (br-ex)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 kernel: vlan20: entered promiscuous mode
Dec 11 00:44:28 np0005554845 systemd-udevd[58307]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6150] device (vlan22)[Open vSwitch Interface]: carrier: link connected
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6156] device (br-ex)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6159] device (br-ex)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6165] device (br-ex)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6184] device (vlan22)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 kernel: vlan21: entered promiscuous mode
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6227] device (vlan22)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6231] device (vlan22)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6237] device (vlan22)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 11 00:44:28 np0005554845 kernel: virtio_net virtio5 eth1: entered promiscuous mode
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6247] device (vlan20)[Open vSwitch Interface]: carrier: link connected
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6263] device (vlan20)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6292] device (vlan20)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6294] device (vlan20)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6298] device (vlan20)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6352] device (vlan21)[Open vSwitch Interface]: carrier: link connected
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6364] device (vlan21)[Open vSwitch Interface]: state change: ip-config -> ip-check (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6381] device (vlan21)[Open vSwitch Interface]: state change: ip-check -> secondaries (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6383] device (vlan21)[Open vSwitch Interface]: state change: secondaries -> activated (reason 'none', managed-type: 'full')
Dec 11 00:44:28 np0005554845 NetworkManager[55529]: <info>  [1765431868.6387] device (vlan21)[Open vSwitch Interface]: Activation: successful, device activated.
Dec 11 00:44:29 np0005554845 NetworkManager[55529]: <info>  [1765431869.7572] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58301 uid=0 result="success"
Dec 11 00:44:29 np0005554845 NetworkManager[55529]: <info>  [1765431869.9304] checkpoint[0x55f61e5fb950]: destroy /org/freedesktop/NetworkManager/Checkpoint/1
Dec 11 00:44:29 np0005554845 NetworkManager[55529]: <info>  [1765431869.9309] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/1" pid=58301 uid=0 result="success"
Dec 11 00:44:30 np0005554845 NetworkManager[55529]: <info>  [1765431870.2670] audit: op="checkpoint-create" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58301 uid=0 result="success"
Dec 11 00:44:30 np0005554845 NetworkManager[55529]: <info>  [1765431870.2686] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58301 uid=0 result="success"
Dec 11 00:44:30 np0005554845 NetworkManager[55529]: <info>  [1765431870.4730] audit: op="networking-control" arg="global-dns-configuration" pid=58301 uid=0 result="success"
Dec 11 00:44:30 np0005554845 NetworkManager[55529]: <info>  [1765431870.4756] config: signal: SET_VALUES,values,values-intern,global-dns-config (/etc/NetworkManager/NetworkManager.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf)
Dec 11 00:44:30 np0005554845 NetworkManager[55529]: <info>  [1765431870.4786] audit: op="networking-control" arg="global-dns-configuration" pid=58301 uid=0 result="success"
Dec 11 00:44:30 np0005554845 NetworkManager[55529]: <info>  [1765431870.4811] audit: op="checkpoint-adjust-rollback-timeout" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58301 uid=0 result="success"
Dec 11 00:44:30 np0005554845 python3.9[58635]: ansible-ansible.legacy.async_status Invoked with jid=j371523542074.58295 mode=status _async_dir=/root/.ansible_async
Dec 11 00:44:30 np0005554845 NetworkManager[55529]: <info>  [1765431870.5988] checkpoint[0x55f61e5fba20]: destroy /org/freedesktop/NetworkManager/Checkpoint/2
Dec 11 00:44:30 np0005554845 NetworkManager[55529]: <info>  [1765431870.5994] audit: op="checkpoint-destroy" arg="/org/freedesktop/NetworkManager/Checkpoint/2" pid=58301 uid=0 result="success"
Dec 11 00:44:30 np0005554845 ansible-async_wrapper.py[58299]: Module complete (58299)
Dec 11 00:44:31 np0005554845 ansible-async_wrapper.py[58298]: Done in kid B.
Dec 11 00:44:34 np0005554845 python3.9[58739]: ansible-ansible.legacy.async_status Invoked with jid=j371523542074.58295 mode=status _async_dir=/root/.ansible_async
Dec 11 00:44:34 np0005554845 python3.9[58839]: ansible-ansible.legacy.async_status Invoked with jid=j371523542074.58295 mode=cleanup _async_dir=/root/.ansible_async
Dec 11 00:44:35 np0005554845 python3.9[58991]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/os-net-config.returncode follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:44:35 np0005554845 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 11 00:44:35 np0005554845 python3.9[59116]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/os-net-config.returncode mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765431874.851243-929-122646757432507/.source.returncode _original_basename=.t46mf9mf follow=False checksum=b6589fc6ab0dc82cf12099d1c2d40ab994e8410c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:44:36 np0005554845 python3.9[59268]: ansible-ansible.legacy.stat Invoked with path=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:44:37 np0005554845 python3.9[59392]: ansible-ansible.legacy.copy Invoked with dest=/etc/cloud/cloud.cfg.d/99-edpm-disable-network-config.cfg mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765431876.279278-976-105771740560265/.source.cfg _original_basename=.fdndcnpn follow=False checksum=f3c5952a9cd4c6c31b314b25eb897168971cc86e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:44:38 np0005554845 python3.9[59544]: ansible-ansible.builtin.systemd Invoked with name=NetworkManager state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:44:38 np0005554845 systemd[1]: Reloading Network Manager...
Dec 11 00:44:38 np0005554845 NetworkManager[55529]: <info>  [1765431878.3107] audit: op="reload" arg="0" pid=59548 uid=0 result="success"
Dec 11 00:44:38 np0005554845 NetworkManager[55529]: <info>  [1765431878.3117] config: signal: SIGHUP,config-files,values,values-user,no-auto-default (/etc/NetworkManager/NetworkManager.conf, /usr/lib/NetworkManager/conf.d/00-server.conf, /run/NetworkManager/conf.d/15-carrier-timeout.conf, /var/lib/NetworkManager/NetworkManager-intern.conf)
Dec 11 00:44:38 np0005554845 systemd[1]: Reloaded Network Manager.
Dec 11 00:44:38 np0005554845 systemd[1]: session-13.scope: Deactivated successfully.
Dec 11 00:44:38 np0005554845 systemd[1]: session-13.scope: Consumed 51.250s CPU time.
Dec 11 00:44:38 np0005554845 systemd-logind[789]: Session 13 logged out. Waiting for processes to exit.
Dec 11 00:44:38 np0005554845 systemd-logind[789]: Removed session 13.
Dec 11 00:44:43 np0005554845 systemd-logind[789]: New session 14 of user zuul.
Dec 11 00:44:43 np0005554845 systemd[1]: Started Session 14 of User zuul.
Dec 11 00:44:44 np0005554845 python3.9[59732]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:44:45 np0005554845 python3.9[59886]: ansible-ansible.builtin.setup Invoked with filter=['ansible_default_ipv4'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:44:47 np0005554845 python3.9[60076]: ansible-ansible.legacy.command Invoked with _raw_params=hostname -f _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:44:47 np0005554845 systemd[1]: session-14.scope: Deactivated successfully.
Dec 11 00:44:47 np0005554845 systemd[1]: session-14.scope: Consumed 2.663s CPU time.
Dec 11 00:44:47 np0005554845 systemd-logind[789]: Session 14 logged out. Waiting for processes to exit.
Dec 11 00:44:47 np0005554845 systemd-logind[789]: Removed session 14.
Dec 11 00:44:48 np0005554845 systemd[1]: NetworkManager-dispatcher.service: Deactivated successfully.
Dec 11 00:44:53 np0005554845 systemd-logind[789]: New session 15 of user zuul.
Dec 11 00:44:53 np0005554845 systemd[1]: Started Session 15 of User zuul.
Dec 11 00:44:54 np0005554845 python3.9[60258]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:44:55 np0005554845 python3.9[60412]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:44:56 np0005554845 python3.9[60568]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:44:57 np0005554845 python3.9[60653]: ansible-ansible.legacy.dnf Invoked with name=['podman'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:44:59 np0005554845 python3.9[60806]: ansible-ansible.builtin.setup Invoked with filter=['ansible_interfaces'] gather_subset=['!all', '!min', 'network'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:45:00 np0005554845 python3.9[60998]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/containers/networks recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:45:01 np0005554845 python3.9[61150]: ansible-ansible.legacy.command Invoked with _raw_params=podman network inspect podman#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:45:01 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:45:02 np0005554845 python3.9[61312]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/networks/podman.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:45:03 np0005554845 python3.9[61390]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/containers/networks/podman.json _original_basename=podman_network_config.j2 recurse=False state=file path=/etc/containers/networks/podman.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:45:04 np0005554845 python3.9[61542]: ansible-ansible.legacy.stat Invoked with path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:45:04 np0005554845 python3.9[61620]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root setype=etc_t dest=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf _original_basename=registries.conf.j2 recurse=False state=file path=/etc/containers/registries.conf.d/20-edpm-podman-registries.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:45:05 np0005554845 python3.9[61772]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=pids_limit owner=root path=/etc/containers/containers.conf section=containers setype=etc_t value=4096 backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:45:06 np0005554845 python3.9[61924]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=events_logger owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="journald" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:45:07 np0005554845 python3.9[62076]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=runtime owner=root path=/etc/containers/containers.conf section=engine setype=etc_t value="crun" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:45:07 np0005554845 python3.9[62228]: ansible-community.general.ini_file Invoked with create=True group=root mode=0644 option=network_backend owner=root path=/etc/containers/containers.conf section=network setype=etc_t value="netavark" backup=False state=present exclusive=True no_extra_spaces=False ignore_spaces=False allow_no_value=False modify_inactive_option=True follow=False unsafe_writes=False section_has_values=None values=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:45:08 np0005554845 python3.9[62380]: ansible-ansible.legacy.dnf Invoked with name=['openssh-server'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:45:11 np0005554845 python3.9[62533]: ansible-setup Invoked with gather_subset=['!all', '!min', 'distribution', 'distribution_major_version', 'distribution_version', 'os_family'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:45:12 np0005554845 python3.9[62687]: ansible-stat Invoked with path=/run/ostree-booted follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:45:13 np0005554845 python3.9[62839]: ansible-stat Invoked with path=/sbin/transactional-update follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:45:13 np0005554845 python3.9[62991]: ansible-ansible.legacy.command Invoked with _raw_params=systemctl is-system-running _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:45:15 np0005554845 python3.9[63144]: ansible-service_facts Invoked
Dec 11 00:45:15 np0005554845 network[63161]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 00:45:15 np0005554845 network[63162]: 'network-scripts' will be removed from distribution in near future.
Dec 11 00:45:15 np0005554845 network[63163]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 00:45:24 np0005554845 python3.9[63616]: ansible-ansible.legacy.dnf Invoked with name=['chrony'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:45:27 np0005554845 python3.9[63769]: ansible-package_facts Invoked with manager=['auto'] strategy=first
Dec 11 00:45:29 np0005554845 python3.9[63921]: ansible-ansible.legacy.stat Invoked with path=/etc/chrony.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:45:30 np0005554845 python3.9[64046]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/chrony.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765431929.1037061-659-40028608007629/.source.conf follow=False _original_basename=chrony.conf.j2 checksum=cfb003e56d02d0d2c65555452eb1a05073fecdad force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:45:31 np0005554845 python3.9[64200]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/chronyd follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:45:31 np0005554845 python3.9[64325]: ansible-ansible.legacy.copy Invoked with backup=True dest=/etc/sysconfig/chronyd mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765431930.7109294-705-207995318206936/.source follow=False _original_basename=chronyd.sysconfig.j2 checksum=dd196b1ff1f915b23eebc37ec77405b5dd3df76c force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:45:33 np0005554845 python3.9[64479]: ansible-lineinfile Invoked with backup=True create=True dest=/etc/sysconfig/network line=PEERNTP=no mode=0644 regexp=^PEERNTP= state=present path=/etc/sysconfig/network encoding=utf-8 backrefs=False firstmatch=False unsafe_writes=False search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:45:35 np0005554845 python3.9[64633]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:45:36 np0005554845 python3.9[64717]: ansible-ansible.legacy.systemd Invoked with enabled=True name=chronyd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:45:38 np0005554845 python3.9[64871]: ansible-ansible.legacy.setup Invoked with gather_subset=['!all'] filter=['ansible_service_mgr'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:45:38 np0005554845 python3.9[64955]: ansible-ansible.legacy.systemd Invoked with name=chronyd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:45:38 np0005554845 chronyd[799]: chronyd exiting
Dec 11 00:45:38 np0005554845 systemd[1]: Stopping NTP client/server...
Dec 11 00:45:38 np0005554845 systemd[1]: chronyd.service: Deactivated successfully.
Dec 11 00:45:38 np0005554845 systemd[1]: Stopped NTP client/server.
Dec 11 00:45:38 np0005554845 systemd[1]: Starting NTP client/server...
Dec 11 00:45:39 np0005554845 chronyd[64963]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER +SIGND +NTS +SECHASH +IPV6 +DEBUG)
Dec 11 00:45:39 np0005554845 chronyd[64963]: Frequency -26.226 +/- 0.239 ppm read from /var/lib/chrony/drift
Dec 11 00:45:39 np0005554845 systemd[1]: Started NTP client/server.
Dec 11 00:45:39 np0005554845 chronyd[64963]: Loaded seccomp filter (level 2)
Dec 11 00:45:39 np0005554845 systemd[1]: session-15.scope: Deactivated successfully.
Dec 11 00:45:39 np0005554845 systemd[1]: session-15.scope: Consumed 27.038s CPU time.
Dec 11 00:45:39 np0005554845 systemd-logind[789]: Session 15 logged out. Waiting for processes to exit.
Dec 11 00:45:39 np0005554845 systemd-logind[789]: Removed session 15.
Dec 11 00:45:45 np0005554845 systemd-logind[789]: New session 16 of user zuul.
Dec 11 00:45:45 np0005554845 systemd[1]: Started Session 16 of User zuul.
Dec 11 00:45:46 np0005554845 python3.9[65144]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:45:47 np0005554845 python3.9[65300]: ansible-ansible.builtin.file Invoked with group=zuul mode=0770 owner=zuul path=/root/.config/containers recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:45:48 np0005554845 python3.9[65475]: ansible-ansible.legacy.stat Invoked with path=/root/.config/containers/auth.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:45:49 np0005554845 python3.9[65553]: ansible-ansible.legacy.file Invoked with group=zuul mode=0660 owner=zuul dest=/root/.config/containers/auth.json _original_basename=.lmsw2_gs recurse=False state=file path=/root/.config/containers/auth.json force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:45:50 np0005554845 python3.9[65705]: ansible-ansible.legacy.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:45:50 np0005554845 python3.9[65828]: ansible-ansible.legacy.copy Invoked with dest=/etc/sysconfig/podman_drop_in mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765431949.9285033-146-179187980024118/.source _original_basename=.geozdbba follow=False checksum=125299ce8dea7711a76292961206447f0043248b backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:45:51 np0005554845 python3.9[65980]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:45:52 np0005554845 python3.9[66132]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:45:53 np0005554845 python3.9[66255]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-container-shutdown group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765431952.1299317-218-118151940078304/.source _original_basename=edpm-container-shutdown follow=False checksum=632c3792eb3dce4288b33ae7b265b71950d69f13 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:45:53 np0005554845 python3.9[66407]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:45:54 np0005554845 python3.9[66530]: ansible-ansible.legacy.copy Invoked with dest=/var/local/libexec/edpm-start-podman-container group=root mode=0700 owner=root setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765431953.394659-218-182216924278687/.source _original_basename=edpm-start-podman-container follow=False checksum=b963c569d75a655c0ccae95d9bb4a2a9a4df27d1 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:45:55 np0005554845 python3.9[66682]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:45:56 np0005554845 python3.9[66834]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:45:57 np0005554845 python3.9[66957]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm-container-shutdown.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431956.1279926-329-144153217589697/.source.service _original_basename=edpm-container-shutdown-service follow=False checksum=6336835cb0f888670cc99de31e19c8c071444d33 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:45:57 np0005554845 python3.9[67109]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:45:58 np0005554845 python3.9[67232]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431957.3862138-373-109725830772777/.source.preset _original_basename=91-edpm-container-shutdown-preset follow=False checksum=b275e4375287528cb63464dd32f622c4f142a915 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:45:59 np0005554845 python3.9[67384]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:45:59 np0005554845 systemd[1]: Reloading.
Dec 11 00:45:59 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:45:59 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:45:59 np0005554845 systemd[1]: Reloading.
Dec 11 00:46:00 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:46:00 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:46:00 np0005554845 systemd[1]: Starting EDPM Container Shutdown...
Dec 11 00:46:00 np0005554845 systemd[1]: Finished EDPM Container Shutdown.
Dec 11 00:46:01 np0005554845 python3.9[67612]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:01 np0005554845 python3.9[67735]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/netns-placeholder.service group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431960.501728-442-156172041351920/.source.service _original_basename=netns-placeholder-service follow=False checksum=b61b1b5918c20c877b8b226fbf34ff89a082d972 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:02 np0005554845 python3.9[67887]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:03 np0005554845 python3.9[68010]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system-preset/91-netns-placeholder.preset group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431961.925312-487-133479447798035/.source.preset _original_basename=91-netns-placeholder-preset follow=False checksum=28b7b9aa893525d134a1eeda8a0a48fb25b736b9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:03 np0005554845 python3.9[68162]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:46:03 np0005554845 systemd[1]: Reloading.
Dec 11 00:46:04 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:46:04 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:46:04 np0005554845 systemd[1]: Reloading.
Dec 11 00:46:04 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:46:04 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:46:04 np0005554845 systemd[1]: Starting Create netns directory...
Dec 11 00:46:04 np0005554845 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 11 00:46:04 np0005554845 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 11 00:46:04 np0005554845 systemd[1]: Finished Create netns directory.
Dec 11 00:46:05 np0005554845 python3.9[68388]: ansible-ansible.builtin.service_facts Invoked
Dec 11 00:46:05 np0005554845 network[68405]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 00:46:05 np0005554845 network[68406]: 'network-scripts' will be removed from distribution in near future.
Dec 11 00:46:05 np0005554845 network[68407]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 00:46:10 np0005554845 python3.9[68669]: ansible-ansible.builtin.systemd Invoked with enabled=False name=iptables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:46:10 np0005554845 systemd[1]: Reloading.
Dec 11 00:46:10 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:46:10 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:46:11 np0005554845 systemd[1]: Stopping IPv4 firewall with iptables...
Dec 11 00:46:11 np0005554845 iptables.init[68709]: iptables: Setting chains to policy ACCEPT: raw mangle filter nat [  OK  ]
Dec 11 00:46:11 np0005554845 iptables.init[68709]: iptables: Flushing firewall rules: [  OK  ]
Dec 11 00:46:11 np0005554845 systemd[1]: iptables.service: Deactivated successfully.
Dec 11 00:46:11 np0005554845 systemd[1]: Stopped IPv4 firewall with iptables.
Dec 11 00:46:12 np0005554845 python3.9[68906]: ansible-ansible.builtin.systemd Invoked with enabled=False name=ip6tables.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:46:13 np0005554845 python3.9[69060]: ansible-ansible.builtin.systemd Invoked with enabled=True name=nftables state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:46:13 np0005554845 systemd[1]: Reloading.
Dec 11 00:46:13 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:46:13 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:46:13 np0005554845 systemd[1]: Starting Netfilter Tables...
Dec 11 00:46:13 np0005554845 systemd[1]: Finished Netfilter Tables.
Dec 11 00:46:14 np0005554845 python3.9[69252]: ansible-ansible.legacy.command Invoked with _raw_params=nft flush ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:46:15 np0005554845 python3.9[69405]: ansible-ansible.legacy.stat Invoked with path=/etc/ssh/sshd_config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:16 np0005554845 python3.9[69530]: ansible-ansible.legacy.copy Invoked with dest=/etc/ssh/sshd_config mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765431975.3558903-695-188284864593161/.source validate=/usr/sbin/sshd -T -f %s follow=False _original_basename=sshd_config_block.j2 checksum=6c79f4cb960ad444688fde322eeacb8402e22d79 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:17 np0005554845 python3.9[69683]: ansible-ansible.builtin.systemd Invoked with name=sshd state=reloaded daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:46:17 np0005554845 systemd[1]: Reloading OpenSSH server daemon...
Dec 11 00:46:17 np0005554845 systemd[1]: Reloaded OpenSSH server daemon.
Dec 11 00:46:18 np0005554845 python3.9[69839]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:19 np0005554845 python3.9[69991]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/sshd-networks.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:19 np0005554845 python3.9[70114]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/sshd-networks.yaml group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431978.5664086-787-97145242164073/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=0bfc8440fd8f39002ab90252479fb794f51b5ae8 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:20 np0005554845 python3.9[70266]: ansible-community.general.timezone Invoked with name=UTC hwclock=None
Dec 11 00:46:20 np0005554845 systemd[1]: Starting Time & Date Service...
Dec 11 00:46:20 np0005554845 systemd[1]: Started Time & Date Service.
Dec 11 00:46:21 np0005554845 python3.9[70422]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:22 np0005554845 python3.9[70574]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:23 np0005554845 python3.9[70697]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765431981.9588997-892-179650964265515/.source.yaml follow=False _original_basename=base-rules.yaml.j2 checksum=450456afcafded6d4bdecceec7a02e806eebd8b3 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:23 np0005554845 python3.9[70849]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:24 np0005554845 python3.9[70972]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765431983.4246225-938-31973684907372/.source.yaml _original_basename=.reoa3gym follow=False checksum=97d170e1550eee4afc0af065b78cda302a97674c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:25 np0005554845 python3.9[71124]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:25 np0005554845 python3.9[71247]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/iptables.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431984.798681-983-2705706814878/.source.nft _original_basename=iptables.nft follow=False checksum=3e02df08f1f3ab4a513e94056dbd390e3d38fe30 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:26 np0005554845 python3.9[71399]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/iptables.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:46:27 np0005554845 python3.9[71552]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:46:28 np0005554845 python3[71705]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 11 00:46:29 np0005554845 python3.9[71857]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:30 np0005554845 python3.9[71980]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431989.2415843-1099-126450881192734/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:31 np0005554845 python3.9[72132]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:31 np0005554845 python3.9[72255]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431990.694277-1144-2573158465033/.source.nft follow=False _original_basename=jump-chain.j2 checksum=4c6f036d2d5808f109acc0880c19aa74ca48c961 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:32 np0005554845 python3.9[72407]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:33 np0005554845 python3.9[72530]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431992.1712732-1190-266575440221473/.source.nft follow=False _original_basename=flush-chain.j2 checksum=d16337256a56373421842284fe09e4e6c7df417e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:34 np0005554845 python3.9[72682]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:34 np0005554845 python3.9[72805]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431993.578766-1235-278686245249667/.source.nft follow=False _original_basename=chains.j2 checksum=2079f3b60590a165d1d502e763170876fc8e2984 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:35 np0005554845 python3.9[72957]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:46:36 np0005554845 python3.9[73080]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765431995.0883749-1281-84422338969156/.source.nft follow=False _original_basename=ruleset.j2 checksum=15a82a0dc61abfd6aa593407582b5b950437eb80 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:37 np0005554845 python3.9[73232]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:38 np0005554845 python3.9[73384]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:46:38 np0005554845 python3.9[73543]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:39 np0005554845 python3.9[73696]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages1G state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:40 np0005554845 python3.9[73848]: ansible-ansible.builtin.file Invoked with group=hugetlbfs mode=0775 owner=zuul path=/dev/hugepages2M state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:41 np0005554845 python3.9[74000]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=1G path=/dev/hugepages1G src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 11 00:46:41 np0005554845 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 00:46:42 np0005554845 python3.9[74154]: ansible-ansible.posix.mount Invoked with fstype=hugetlbfs opts=pagesize=2M path=/dev/hugepages2M src=none state=mounted boot=True dump=0 opts_no_log=False passno=0 backup=False fstab=None
Dec 11 00:46:43 np0005554845 systemd[1]: session-16.scope: Deactivated successfully.
Dec 11 00:46:43 np0005554845 systemd[1]: session-16.scope: Consumed 36.323s CPU time.
Dec 11 00:46:43 np0005554845 systemd-logind[789]: Session 16 logged out. Waiting for processes to exit.
Dec 11 00:46:43 np0005554845 systemd-logind[789]: Removed session 16.
Dec 11 00:46:48 np0005554845 systemd-logind[789]: New session 17 of user zuul.
Dec 11 00:46:48 np0005554845 systemd[1]: Started Session 17 of User zuul.
Dec 11 00:46:49 np0005554845 python3.9[74335]: ansible-ansible.builtin.tempfile Invoked with state=file prefix=ansible. suffix= path=None
Dec 11 00:46:50 np0005554845 python3.9[74487]: ansible-ansible.builtin.stat Invoked with path=/etc/ssh/ssh_known_hosts follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:46:50 np0005554845 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 11 00:46:51 np0005554845 python3.9[74641]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'ssh_host_key_rsa_public', 'ssh_host_key_ed25519_public', 'ssh_host_key_ecdsa_public'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:46:52 np0005554845 python3.9[74793]: ansible-ansible.builtin.blockinfile Invoked with block=compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDtPK17fZLjTZjlmLKlbXL10xthZpUABwGAxTBhggJv0ipZ6BX9i2PkxDt9rfoiD0pBBRr0fQ8pKyd0tpDpkAKcggJ8RbDgdE8IMguj8pPf+DFib8fumF5m7WOS89gsEBilz6GWbgWL9Cc/DB98N18mPWG/3dcX6jHiqU/dunXP3mBpEpj+1tk/MQ+2DUElJUCCpuO/F3E0Z1hsqHRQzylzg3yfSswgYEcIPlOC95lM6ebI8yAuLKo98ye746tbcHsnAHIdk+VA9zRKCLePiowHYoqm1UCqWq8/AqpwLyKpbag1m7l83ZybcrN/47GzZ4cyOtucvpKJDtgY4mmlpGhVfa6soYP6Q9oNVACyA6Znwxowz4tH/RsVI8sK8CUsa/EvdRMe48eZErjg47G6JaL6n+/iYrCNjNk2H+xN8mQdDZ5oKAezDTDq0FrY/ktil1sfxUCHvjTNPtW9H9V5wnWHrzB2e+DZ4d3leTR0RBcHi4tlBiT7mReyyCAp16lYObs=#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILtixqfq867jjKrhhJx8NNA/4YhyO+RXT3YYhNmWCNM9#012compute-0.ctlplane.example.com,192.168.122.100,compute-0* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBMgBIRhWhgLU44dawKUOyXWlCHxOFrWcA3hnlFPt/yPbw1HQHiCOkhhlkdQDqUlfbzDiCACfTVplGYq9rYhxK40=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQDHgQXgkv8B3KBtSB3lo91S2C4P90VrwF3EYiZyNqY2rngbb9u9WWCIQOHpj8x8aY31BQ8SqExUilR+2RwBEjRw2Uh7xQGtN2sPiXIiUQFJBQCIej85NgV57Axdim2Y2BkLKWF3CGtaBsaaX1RMqao1CozuybvMcYl5yMqo1oyvxHENo5T3z5vK3Ugz7ZAXqkHq7NajGFrLFrVDBvSGxAaTkcBPOghuON0kw7rcpHJxuYTDKKzmiAnxfTvjqMcb1wtOdxezSBLWgwRG2Wv3Me27gEUrgWlvSVyv7X47nnosx7qnUpUkJj83XcUzR7zGm7BoZr6Al28L/1ftNY9miK/bTmylPlKXtCrFhj5iZQ2iP6EW1nkvLBN6s6OnhQL2HfLpCi+L4dKXF1teumXZDMbg/K5AkCFLbz24GkrohckvNb5VfOGdOVfe23PEiTmEIgY8qylmIEw3qJbhg8y60OBgKyaQKE6tUJuAi0k3P3DsSNyQuOXoUOsEvjiicp+wjgs=#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAILTqOE9fRtM6SOAmPaxYLOXI2lb3b+I6y23VcC3mUULY#012compute-1.ctlplane.example.com,192.168.122.101,compute-1* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBOxdJwPXXeWUJ0sc5ZMjng2SAguGRBAy9RBzPYL9rbDPcNpwEsCX4yP3HJuq7D7wfkFFA37G9XEUG6HRj9wjkbk=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABgQCkbgt5se61b/9NDYcoKz8fpmC4MJe2OMrrhKOJvyNVTokzQI5abgng06t7tgPtu6OanI+kX8VxeFruyocSfzAqHzcD4bY+mugoRQLzC+bthKEuS4ZMIXLFe6zepITUIMmLOuFDXsvLGFToxw/zykGbx+K48lU2aRgln95MMXvWHmblywfc9Sg2uwIoWOnNRL2aSwNIavdEh1Ne2wkNtmdDRBVK2kO2pWCImff7Xo4xg3u4w5fnFdZ+eOH8lVOW8vcysYooc36uQkP2xxK/wPFAcIQ/hNZp/v/P/AzXmKgiKwR9FBk228dPsco0WPECYb6c1xDcqNgSubFcfy0gahkD2YW1o44N/uGaZzQg7QP6dQPI+2Ii/YntVedveNq3/cx9lAA/p3pV6DCeveUlMwfRoYxOun2KGaIcvwLE0kluU5+sDRJbwRQxPQjyJFLNlbdsznJiiqLdUYduj51UJn3X/Gj7ih34VyzJ6gHXnJztp6AyOqZrYISNn3XSxMX7ba8=#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAICNXNrTLO6FRy1505O7Hqvzru2fsgsGCdSaPh+ewyzSK#012compute-2.ctlplane.example.com,192.168.122.102,compute-2* ecdsa-sha2-nistp256 AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKoeLkE6B3yJw6drPclFIfZbEBEK84K4QACvmbMpYlsiTWO4x55Ue6dA+YHzZeVJFkkEn2AISMDhzJisre8CHv0=#012 create=True mode=0644 path=/tmp/ansible.wc5jlbgx state=present marker=# {mark} ANSIBLE MANAGED BLOCK backup=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:53 np0005554845 python3.9[74945]: ansible-ansible.legacy.command Invoked with _raw_params=cat '/tmp/ansible.wc5jlbgx' > /etc/ssh/ssh_known_hosts _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:46:53 np0005554845 python3.9[75099]: ansible-ansible.builtin.file Invoked with path=/tmp/ansible.wc5jlbgx state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:46:54 np0005554845 systemd-logind[789]: Session 17 logged out. Waiting for processes to exit.
Dec 11 00:46:54 np0005554845 systemd[1]: session-17.scope: Deactivated successfully.
Dec 11 00:46:54 np0005554845 systemd[1]: session-17.scope: Consumed 3.464s CPU time.
Dec 11 00:46:54 np0005554845 systemd-logind[789]: Removed session 17.
Dec 11 00:47:00 np0005554845 systemd-logind[789]: New session 18 of user zuul.
Dec 11 00:47:00 np0005554845 systemd[1]: Started Session 18 of User zuul.
Dec 11 00:47:01 np0005554845 python3.9[75277]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:47:02 np0005554845 python3.9[75433]: ansible-ansible.builtin.systemd Invoked with enabled=True name=sshd daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 11 00:47:03 np0005554845 python3.9[75587]: ansible-ansible.builtin.systemd Invoked with name=sshd state=started daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:47:04 np0005554845 python3.9[75740]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:47:05 np0005554845 python3.9[75893]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:47:06 np0005554845 python3.9[76047]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:47:07 np0005554845 python3.9[76202]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:07 np0005554845 systemd[1]: session-18.scope: Deactivated successfully.
Dec 11 00:47:07 np0005554845 systemd[1]: session-18.scope: Consumed 4.295s CPU time.
Dec 11 00:47:07 np0005554845 systemd-logind[789]: Session 18 logged out. Waiting for processes to exit.
Dec 11 00:47:07 np0005554845 systemd-logind[789]: Removed session 18.
Dec 11 00:47:13 np0005554845 systemd-logind[789]: New session 19 of user zuul.
Dec 11 00:47:13 np0005554845 systemd[1]: Started Session 19 of User zuul.
Dec 11 00:47:14 np0005554845 python3.9[76380]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:47:16 np0005554845 python3.9[76536]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:47:16 np0005554845 python3.9[76620]: ansible-ansible.legacy.dnf Invoked with name=['yum-utils'] allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None state=None
Dec 11 00:47:19 np0005554845 python3.9[76771]: ansible-ansible.legacy.command Invoked with _raw_params=needs-restarting -r _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:47:20 np0005554845 python3.9[76922]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/reboot_required/'] patterns=[] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 00:47:21 np0005554845 python3.9[77072]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:47:21 np0005554845 python3.9[77222]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/config follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:47:22 np0005554845 systemd[1]: session-19.scope: Deactivated successfully.
Dec 11 00:47:22 np0005554845 systemd[1]: session-19.scope: Consumed 5.894s CPU time.
Dec 11 00:47:22 np0005554845 systemd-logind[789]: Session 19 logged out. Waiting for processes to exit.
Dec 11 00:47:22 np0005554845 systemd-logind[789]: Removed session 19.
Dec 11 00:47:27 np0005554845 systemd-logind[789]: New session 20 of user zuul.
Dec 11 00:47:27 np0005554845 systemd[1]: Started Session 20 of User zuul.
Dec 11 00:47:28 np0005554845 python3.9[77400]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:47:29 np0005554845 python3.9[77556]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:30 np0005554845 python3.9[77708]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/libvirt/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:31 np0005554845 python3.9[77860]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:31 np0005554845 python3.9[77983]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432050.7017596-154-41205253004161/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=716f262add9b5cc7757bf8497294cc4a18a6ba6a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:32 np0005554845 python3.9[78135]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:33 np0005554845 python3.9[78258]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432052.1378098-154-209437686159021/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=7f33b8a85928e81539b2bda30dd692b146a50a62 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:33 np0005554845 python3.9[78410]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/libvirt/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:34 np0005554845 python3.9[78533]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/libvirt/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432053.2804637-154-276155699436340/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=214865cfe88239ee3b47599ce5d08007b290debc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:34 np0005554845 python3.9[78685]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:35 np0005554845 python3.9[78837]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/telemetry/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:36 np0005554845 python3.9[78989]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:36 np0005554845 python3.9[79112]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432055.7602725-324-54210085773189/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=8ba95a7ee1a37a7f2e4f2a5b68969ee44b624b1d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:37 np0005554845 python3.9[79264]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:37 np0005554845 python3.9[79387]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432056.8943524-324-271013436184846/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=4a57f9a6e47bdfda6092c0657fa8893e329de040 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:38 np0005554845 python3.9[79539]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:39 np0005554845 python3.9[79662]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/telemetry/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432058.0261204-324-74365062392534/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=4193eda9b631cbc7659ba96b00f41a08a5dee84e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:39 np0005554845 python3.9[79814]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:40 np0005554845 python3.9[79966]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/neutron-metadata/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:40 np0005554845 python3.9[80118]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:41 np0005554845 python3.9[80241]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432060.4625468-491-36125413662556/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=72b8360c807f52fd565080f11a910a65619d8866 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:42 np0005554845 python3.9[80393]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:42 np0005554845 python3.9[80516]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432061.5597274-491-154424756997762/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=cdacc8998ae838597f369a1b85d7db54e649a568 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:43 np0005554845 python3.9[80668]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/neutron-metadata/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:43 np0005554845 python3.9[80791]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/neutron-metadata/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432062.781378-491-40836005260408/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=b35cbb56f575b9f2121f7b91e0f26bd37a83bd7e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:44 np0005554845 python3.9[80943]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:45 np0005554845 python3.9[81095]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/certs/ovn/default setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:45 np0005554845 python3.9[81247]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:46 np0005554845 python3.9[81370]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432065.3081756-665-220234879322066/.source.crt _original_basename=compute-2.ctlplane.example.com-tls.crt follow=False checksum=824e0b7943f471bf8345ffd7b7fc27a8d2ef7fab backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:46 np0005554845 python3.9[81522]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/ca.crt follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:47 np0005554845 python3.9[81645]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/ca.crt group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432066.48935-665-11510388454160/.source.crt _original_basename=compute-2.ctlplane.example.com-ca.crt follow=False checksum=cdacc8998ae838597f369a1b85d7db54e649a568 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:48 np0005554845 python3.9[81797]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/certs/ovn/default/tls.key follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:48 np0005554845 chronyd[64963]: Selected source 54.39.17.239 (pool.ntp.org)
Dec 11 00:47:48 np0005554845 python3.9[81920]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/certs/ovn/default/tls.key group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432067.7136738-665-227354872345661/.source.key _original_basename=compute-2.ctlplane.example.com-tls.key follow=False checksum=e45c0b20d146bae175869cfd2ec70136007a7081 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:49 np0005554845 python3.9[82072]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/telemetry setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:50 np0005554845 python3.9[82224]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:51 np0005554845 python3.9[82347]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432070.0912375-857-163738686315697/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=88bc5d1df1135a8eda5bcc12255c75569f113986 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:51 np0005554845 python3.9[82499]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:52 np0005554845 python3.9[82651]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:53 np0005554845 python3.9[82774]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/libvirt/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432071.9299846-933-177287820383151/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=88bc5d1df1135a8eda5bcc12255c75569f113986 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:53 np0005554845 python3.9[82926]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/repo-setup setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:54 np0005554845 python3.9[83078]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:55 np0005554845 python3.9[83201]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/repo-setup/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432074.0077488-1008-80513644973153/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=88bc5d1df1135a8eda5bcc12255c75569f113986 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:55 np0005554845 python3.9[83353]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/neutron-metadata setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:56 np0005554845 python3.9[83505]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:57 np0005554845 python3.9[83628]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432076.0397005-1077-202389420610928/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=88bc5d1df1135a8eda5bcc12255c75569f113986 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:57 np0005554845 python3.9[83780]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:47:58 np0005554845 python3.9[83932]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:47:59 np0005554845 python3.9[84055]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432078.0071626-1148-125675242908839/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=88bc5d1df1135a8eda5bcc12255c75569f113986 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:47:59 np0005554845 python3.9[84207]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/bootstrap setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:48:00 np0005554845 python3.9[84359]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:01 np0005554845 python3.9[84482]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/bootstrap/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432079.9602191-1218-186462584526026/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=88bc5d1df1135a8eda5bcc12255c75569f113986 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:01 np0005554845 python3.9[84634]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/var/lib/openstack/cacerts/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:48:02 np0005554845 python3.9[84786]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:03 np0005554845 python3.9[84909]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432082.1596367-1291-207641828379404/.source.pem _original_basename=tls-ca-bundle.pem follow=False checksum=88bc5d1df1135a8eda5bcc12255c75569f113986 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:08 np0005554845 systemd-logind[789]: Session 20 logged out. Waiting for processes to exit.
Dec 11 00:48:08 np0005554845 systemd[1]: session-20.scope: Deactivated successfully.
Dec 11 00:48:08 np0005554845 systemd[1]: session-20.scope: Consumed 27.898s CPU time.
Dec 11 00:48:08 np0005554845 systemd-logind[789]: Removed session 20.
Dec 11 00:48:14 np0005554845 systemd-logind[789]: New session 21 of user zuul.
Dec 11 00:48:14 np0005554845 systemd[1]: Started Session 21 of User zuul.
Dec 11 00:48:15 np0005554845 python3.9[85087]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:48:16 np0005554845 python3.9[85243]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:48:17 np0005554845 python3.9[85395]: ansible-ansible.builtin.file Invoked with group=openvswitch owner=openvswitch path=/var/lib/openvswitch/ovn setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:48:18 np0005554845 python3.9[85545]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:48:19 np0005554845 python3.9[85697]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 11 00:48:20 np0005554845 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=11 res=1
Dec 11 00:48:21 np0005554845 python3.9[85853]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:48:22 np0005554845 python3.9[85937]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:48:24 np0005554845 python3.9[86090]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 00:48:25 np0005554845 python3[86245]: ansible-osp.edpm.edpm_nftables_snippet Invoked with content=- rule_name: 118 neutron vxlan networks#012  rule:#012    proto: udp#012    dport: 4789#012- rule_name: 119 neutron geneve networks#012  rule:#012    proto: udp#012    dport: 6081#012    state: ["UNTRACKED"]#012- rule_name: 120 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: OUTPUT#012    jump: NOTRACK#012    action: append#012    state: []#012- rule_name: 121 neutron geneve networks no conntrack#012  rule:#012    proto: udp#012    dport: 6081#012    table: raw#012    chain: PREROUTING#012    jump: NOTRACK#012    action: append#012    state: []#012 dest=/var/lib/edpm-config/firewall/ovn.yaml state=present
Dec 11 00:48:26 np0005554845 python3.9[86397]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:27 np0005554845 python3.9[86549]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:27 np0005554845 python3.9[86627]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:28 np0005554845 python3.9[86779]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:29 np0005554845 python3.9[86857]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.tg__3vfk recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:29 np0005554845 python3.9[87009]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:30 np0005554845 python3.9[87087]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:31 np0005554845 python3.9[87239]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:48:32 np0005554845 python3[87392]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 11 00:48:33 np0005554845 python3.9[87544]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:33 np0005554845 python3.9[87669]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432112.7329438-434-137075421802751/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:34 np0005554845 python3.9[87821]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:35 np0005554845 python3.9[87946]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-update-jumps.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432114.263687-479-120756294136230/.source.nft follow=False _original_basename=jump-chain.j2 checksum=81c2fc96c23335ffe374f9b064e885d5d971ddf9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:36 np0005554845 python3.9[88098]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:36 np0005554845 python3.9[88223]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-flushes.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432115.8424153-524-226375052930005/.source.nft follow=False _original_basename=flush-chain.j2 checksum=4d3ffec49c8eb1a9b80d2f1e8cd64070063a87b4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:37 np0005554845 python3.9[88375]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:38 np0005554845 python3.9[88500]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-chains.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432117.363476-568-116413056183462/.source.nft follow=False _original_basename=chains.j2 checksum=298ada419730ec15df17ded0cc50c97a4014a591 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:39 np0005554845 python3.9[88652]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:39 np0005554845 python3.9[88777]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432118.8635056-614-12063688697415/.source.nft follow=False _original_basename=ruleset.j2 checksum=eb691bdb7d792c5f8ff0d719e807fe1c95b09438 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:40 np0005554845 python3.9[88929]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:41 np0005554845 python3.9[89081]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:48:42 np0005554845 python3.9[89236]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:43 np0005554845 python3.9[89388]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:48:44 np0005554845 python3.9[89541]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:48:45 np0005554845 python3.9[89695]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:48:46 np0005554845 python3.9[89850]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:47 np0005554845 python3.9[90000]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'machine'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:48:48 np0005554845 python3.9[90153]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings="datacentre:1e:0a:cb:58:d7:dd" external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch #012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:48:48 np0005554845 ovs-vsctl[90154]: ovs|00001|vsctl|INFO|Called as ovs-vsctl set open . external_ids:hostname=compute-2.ctlplane.example.com external_ids:ovn-bridge=br-int external_ids:ovn-bridge-mappings=datacentre:br-ex external_ids:ovn-chassis-mac-mappings=datacentre:1e:0a:cb:58:d7:dd external_ids:ovn-encap-ip=172.19.0.102 external_ids:ovn-encap-type=geneve external_ids:ovn-encap-tos=0 external_ids:ovn-match-northd-version=False external_ids:ovn-monitor-all=True external_ids:ovn-remote=ssl:ovsdbserver-sb.openstack.svc:6642 external_ids:ovn-remote-probe-interval=60000 external_ids:ovn-ofctrl-wait-before-clear=8000 external_ids:rundir=/var/run/openvswitch
Dec 11 00:48:49 np0005554845 python3.9[90306]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ovs-vsctl show | grep -q "Manager"#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:48:50 np0005554845 python3.9[90461]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl --timeout=5 --id=@manager -- create Manager target=\"ptcp:********@manager#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:48:50 np0005554845 ovs-vsctl[90462]: ovs|00001|vsctl|INFO|Called as ovs-vsctl --timeout=5 --id=@manager -- create Manager "target=\"ptcp:6640:127.0.0.1\"" -- add Open_vSwitch . manager_options @manager
Dec 11 00:48:51 np0005554845 python3.9[90612]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:48:51 np0005554845 python3.9[90766]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:48:53 np0005554845 python3.9[90918]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:53 np0005554845 python3.9[90996]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:48:54 np0005554845 python3.9[91148]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:54 np0005554845 python3.9[91226]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:48:55 np0005554845 python3.9[91378]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:56 np0005554845 python3.9[91530]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:56 np0005554845 python3.9[91608]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:57 np0005554845 python3.9[91760]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:48:58 np0005554845 python3.9[91838]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:48:58 np0005554845 python3.9[91990]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:48:58 np0005554845 systemd[1]: Reloading.
Dec 11 00:48:59 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:48:59 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:49:00 np0005554845 python3.9[92181]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:00 np0005554845 python3.9[92259]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:49:01 np0005554845 python3.9[92411]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:01 np0005554845 python3.9[92489]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:49:03 np0005554845 python3.9[92641]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:49:03 np0005554845 systemd[1]: Reloading.
Dec 11 00:49:03 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:49:03 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:49:03 np0005554845 systemd[1]: Starting Create netns directory...
Dec 11 00:49:03 np0005554845 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 11 00:49:03 np0005554845 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 11 00:49:03 np0005554845 systemd[1]: Finished Create netns directory.
Dec 11 00:49:04 np0005554845 python3.9[92835]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:04 np0005554845 python3.9[92987]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_controller/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:05 np0005554845 python3.9[93110]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_controller/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432144.4300406-1366-26602423635792/.source _original_basename=healthcheck follow=False checksum=4098dd010265fabdf5c26b97d169fc4e575ff457 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:06 np0005554845 python3.9[93262]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:07 np0005554845 python3.9[93414]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_controller.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:08 np0005554845 python3.9[93537]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_controller.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432147.3655667-1441-20117994176809/.source.json _original_basename=._kw4xpt1 follow=False checksum=2328fc98619beeb08ee32b01f15bb43094c10b61 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:49:09 np0005554845 python3.9[93689]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_controller state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:49:12 np0005554845 python3.9[94116]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_controller config_pattern=*.json debug=False
Dec 11 00:49:14 np0005554845 python3.9[94268]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 00:49:15 np0005554845 python3.9[94420]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 11 00:49:15 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:49:17 np0005554845 python3[94583]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_controller config_id=ovn_controller config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 00:49:17 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:49:17 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:49:17 np0005554845 podman[94618]: 2025-12-11 05:49:17.303953718 +0000 UTC m=+0.050099999 container create a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 11 00:49:17 np0005554845 podman[94618]: 2025-12-11 05:49:17.28004359 +0000 UTC m=+0.026189891 image pull a17927617ef5a603f0594ee0d6df65aabdc9e0303ccc5a52c36f193de33ee0fe quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 11 00:49:17 np0005554845 python3[94583]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_controller --conmon-pidfile /run/ovn_controller.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=ovn_controller --label container_name=ovn_controller --label managed_by=edpm_ansible --label config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --user root --volume /lib/modules:/lib/modules:ro --volume /run:/run --volume /var/lib/openvswitch/ovn:/run/ovn:shared,z --volume /var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified
Dec 11 00:49:18 np0005554845 systemd[1]: var-lib-containers-storage-overlay.mount: Deactivated successfully.
Dec 11 00:49:18 np0005554845 python3.9[94806]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:49:19 np0005554845 python3.9[94960]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_controller.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:49:20 np0005554845 python3.9[95036]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_controller_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:49:20 np0005554845 python3.9[95187]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765432160.1607075-1705-252630578728271/source dest=/etc/systemd/system/edpm_ovn_controller.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:49:21 np0005554845 python3.9[95263]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:49:21 np0005554845 systemd[1]: Reloading.
Dec 11 00:49:21 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:49:21 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:49:22 np0005554845 python3.9[95374]: ansible-systemd Invoked with state=restarted name=edpm_ovn_controller.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:49:22 np0005554845 systemd[1]: Reloading.
Dec 11 00:49:22 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:49:22 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:49:22 np0005554845 systemd[1]: Starting ovn_controller container...
Dec 11 00:49:22 np0005554845 systemd[1]: Created slice Virtual Machine and Container Slice.
Dec 11 00:49:22 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:49:22 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/91a9bdc216bdc55294c62c4ea89bc4e217129b1a6b209d60c677b57e0268808d/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 11 00:49:22 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1.
Dec 11 00:49:22 np0005554845 podman[95414]: 2025-12-11 05:49:22.615979121 +0000 UTC m=+0.117379218 container init a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 11 00:49:22 np0005554845 ovn_controller[95428]: + sudo -E kolla_set_configs
Dec 11 00:49:22 np0005554845 podman[95414]: 2025-12-11 05:49:22.639957881 +0000 UTC m=+0.141357978 container start a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 11 00:49:22 np0005554845 edpm-start-podman-container[95414]: ovn_controller
Dec 11 00:49:22 np0005554845 systemd[1]: Created slice User Slice of UID 0.
Dec 11 00:49:22 np0005554845 systemd[1]: Starting User Runtime Directory /run/user/0...
Dec 11 00:49:22 np0005554845 systemd[1]: Finished User Runtime Directory /run/user/0.
Dec 11 00:49:22 np0005554845 systemd[1]: Starting User Manager for UID 0...
Dec 11 00:49:22 np0005554845 edpm-start-podman-container[95413]: Creating additional drop-in dependency for "ovn_controller" (a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1)
Dec 11 00:49:22 np0005554845 podman[95434]: 2025-12-11 05:49:22.711129296 +0000 UTC m=+0.059606410 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 11 00:49:22 np0005554845 systemd[1]: a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1-c137011925e66cd.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 00:49:22 np0005554845 systemd[1]: a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1-c137011925e66cd.service: Failed with result 'exit-code'.
Dec 11 00:49:22 np0005554845 systemd[1]: Reloading.
Dec 11 00:49:22 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:49:22 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:49:22 np0005554845 systemd[95472]: Queued start job for default target Main User Target.
Dec 11 00:49:22 np0005554845 systemd[95472]: Created slice User Application Slice.
Dec 11 00:49:22 np0005554845 systemd[95472]: Mark boot as successful after the user session has run 2 minutes was skipped because of an unmet condition check (ConditionUser=!@system).
Dec 11 00:49:22 np0005554845 systemd[95472]: Started Daily Cleanup of User's Temporary Directories.
Dec 11 00:49:22 np0005554845 systemd[95472]: Reached target Paths.
Dec 11 00:49:22 np0005554845 systemd[95472]: Reached target Timers.
Dec 11 00:49:22 np0005554845 systemd[95472]: Starting D-Bus User Message Bus Socket...
Dec 11 00:49:22 np0005554845 systemd[95472]: Starting Create User's Volatile Files and Directories...
Dec 11 00:49:22 np0005554845 systemd[95472]: Finished Create User's Volatile Files and Directories.
Dec 11 00:49:22 np0005554845 systemd[95472]: Listening on D-Bus User Message Bus Socket.
Dec 11 00:49:22 np0005554845 systemd[95472]: Reached target Sockets.
Dec 11 00:49:22 np0005554845 systemd[95472]: Reached target Basic System.
Dec 11 00:49:22 np0005554845 systemd[95472]: Reached target Main User Target.
Dec 11 00:49:22 np0005554845 systemd[95472]: Startup finished in 121ms.
Dec 11 00:49:22 np0005554845 systemd[1]: Started User Manager for UID 0.
Dec 11 00:49:22 np0005554845 systemd[1]: Started ovn_controller container.
Dec 11 00:49:22 np0005554845 systemd[1]: Started Session c1 of User root.
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: INFO:__main__:Validating config file
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: INFO:__main__:Writing out command to execute
Dec 11 00:49:23 np0005554845 systemd[1]: session-c1.scope: Deactivated successfully.
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: ++ cat /run_command
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: + CMD='/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: + ARGS=
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: + sudo kolla_copy_cacerts
Dec 11 00:49:23 np0005554845 systemd[1]: Started Session c2 of User root.
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: + [[ ! -n '' ]]
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: + . kolla_extend_start
Dec 11 00:49:23 np0005554845 systemd[1]: session-c2.scope: Deactivated successfully.
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: Running command: '/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: + echo 'Running command: '\''/usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock  -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt '\'''
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: + umask 0022
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: + exec /usr/bin/ovn-controller --pidfile unix:/run/openvswitch/db.sock -p /etc/pki/tls/private/ovndb.key -c /etc/pki/tls/certs/ovndb.crt -C /etc/pki/tls/certs/ovndbca.crt
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00001|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00002|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00003|main|INFO|OVN internal version is : [24.03.8-20.33.0-76.8]
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00004|main|INFO|OVS IDL reconnected, force recompute.
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00005|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connecting...
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00006|main|INFO|OVNSB IDL reconnected, force recompute.
Dec 11 00:49:23 np0005554845 NetworkManager[55529]: <info>  [1765432163.1073] manager: (br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/14)
Dec 11 00:49:23 np0005554845 NetworkManager[55529]: <info>  [1765432163.1081] device (br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 00:49:23 np0005554845 NetworkManager[55529]: <warn>  [1765432163.1084] device (br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 00:49:23 np0005554845 NetworkManager[55529]: <info>  [1765432163.1089] manager: (br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/15)
Dec 11 00:49:23 np0005554845 NetworkManager[55529]: <info>  [1765432163.1094] manager: (br-int): new Open vSwitch Bridge device (/org/freedesktop/NetworkManager/Devices/16)
Dec 11 00:49:23 np0005554845 NetworkManager[55529]: <info>  [1765432163.1098] device (br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 11 00:49:23 np0005554845 kernel: br-int: entered promiscuous mode
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00007|reconnect|INFO|ssl:ovsdbserver-sb.openstack.svc:6642: connected
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00008|features|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00009|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00010|features|INFO|OVS Feature: ct_zero_snat, state: supported
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00011|features|INFO|OVS Feature: ct_flush, state: supported
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00012|features|INFO|OVS Feature: dp_hash_l4_sym_support, state: supported
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00013|reconnect|INFO|unix:/run/openvswitch/db.sock: connecting...
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00014|main|INFO|OVS feature set changed, force recompute.
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00015|ofctrl|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00016|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00017|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00018|reconnect|INFO|unix:/run/openvswitch/db.sock: connected
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00019|features|INFO|OVS DB schema supports 4 flow table prefixes, our IDL supports: 4
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00020|rconn|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00021|ofctrl|INFO|ofctrl-wait-before-clear is now 8000 ms (was 0 ms)
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00022|main|INFO|OVS OpenFlow connection reconnected,force recompute.
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00023|main|INFO|Setting flow table prefixes: ip_src, ip_dst, ipv6_src, ipv6_dst.
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00024|main|INFO|OVS feature set changed, force recompute.
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00001|pinctrl(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00002|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00001|statctrl(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting to switch
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00002|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connecting...
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00003|rconn(ovn_pinctrl0)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 11 00:49:23 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:23Z|00003|rconn(ovn_statctrl3)|INFO|unix:/var/run/openvswitch/br-int.mgmt: connected
Dec 11 00:49:23 np0005554845 NetworkManager[55529]: <info>  [1765432163.1301] manager: (ovn-d085ee-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/17)
Dec 11 00:49:23 np0005554845 systemd-udevd[95565]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 00:49:23 np0005554845 kernel: genev_sys_6081: entered promiscuous mode
Dec 11 00:49:23 np0005554845 NetworkManager[55529]: <info>  [1765432163.1448] device (genev_sys_6081): carrier: link connected
Dec 11 00:49:23 np0005554845 systemd-udevd[95567]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 00:49:23 np0005554845 NetworkManager[55529]: <info>  [1765432163.1455] manager: (genev_sys_6081): new Generic device (/org/freedesktop/NetworkManager/Devices/18)
Dec 11 00:49:23 np0005554845 NetworkManager[55529]: <info>  [1765432163.3309] manager: (ovn-0d0e38-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/19)
Dec 11 00:49:23 np0005554845 NetworkManager[55529]: <info>  [1765432163.6352] manager: (ovn-7e0ed8-0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/20)
Dec 11 00:49:24 np0005554845 python3.9[95697]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove open . other_config hw-offload#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:49:24 np0005554845 ovs-vsctl[95698]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove open . other_config hw-offload
Dec 11 00:49:24 np0005554845 python3.9[95850]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl get Open_vSwitch . external_ids:ovn-cms-options | sed 's/\"//g'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:49:24 np0005554845 ovs-vsctl[95852]: ovs|00001|db_ctl_base|ERR|no key "ovn-cms-options" in Open_vSwitch record "." column external_ids
Dec 11 00:49:26 np0005554845 python3.9[96005]: ansible-ansible.legacy.command Invoked with _raw_params=ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:49:26 np0005554845 ovs-vsctl[96006]: ovs|00001|vsctl|INFO|Called as ovs-vsctl remove Open_vSwitch . external_ids ovn-cms-options
Dec 11 00:49:27 np0005554845 systemd[1]: session-21.scope: Deactivated successfully.
Dec 11 00:49:27 np0005554845 systemd[1]: session-21.scope: Consumed 44.728s CPU time.
Dec 11 00:49:27 np0005554845 systemd-logind[789]: Session 21 logged out. Waiting for processes to exit.
Dec 11 00:49:27 np0005554845 systemd-logind[789]: Removed session 21.
Dec 11 00:49:32 np0005554845 systemd-logind[789]: New session 23 of user zuul.
Dec 11 00:49:32 np0005554845 systemd[1]: Started Session 23 of User zuul.
Dec 11 00:49:33 np0005554845 systemd[1]: Stopping User Manager for UID 0...
Dec 11 00:49:33 np0005554845 systemd[95472]: Activating special unit Exit the Session...
Dec 11 00:49:33 np0005554845 systemd[95472]: Stopped target Main User Target.
Dec 11 00:49:33 np0005554845 systemd[95472]: Stopped target Basic System.
Dec 11 00:49:33 np0005554845 systemd[95472]: Stopped target Paths.
Dec 11 00:49:33 np0005554845 systemd[95472]: Stopped target Sockets.
Dec 11 00:49:33 np0005554845 systemd[95472]: Stopped target Timers.
Dec 11 00:49:33 np0005554845 systemd[95472]: Stopped Daily Cleanup of User's Temporary Directories.
Dec 11 00:49:33 np0005554845 systemd[95472]: Closed D-Bus User Message Bus Socket.
Dec 11 00:49:33 np0005554845 systemd[95472]: Stopped Create User's Volatile Files and Directories.
Dec 11 00:49:33 np0005554845 systemd[95472]: Removed slice User Application Slice.
Dec 11 00:49:33 np0005554845 systemd[95472]: Reached target Shutdown.
Dec 11 00:49:33 np0005554845 systemd[95472]: Finished Exit the Session.
Dec 11 00:49:33 np0005554845 systemd[95472]: Reached target Exit the Session.
Dec 11 00:49:33 np0005554845 systemd[1]: user@0.service: Deactivated successfully.
Dec 11 00:49:33 np0005554845 systemd[1]: Stopped User Manager for UID 0.
Dec 11 00:49:33 np0005554845 systemd[1]: Stopping User Runtime Directory /run/user/0...
Dec 11 00:49:33 np0005554845 systemd[1]: run-user-0.mount: Deactivated successfully.
Dec 11 00:49:33 np0005554845 systemd[1]: user-runtime-dir@0.service: Deactivated successfully.
Dec 11 00:49:33 np0005554845 systemd[1]: Stopped User Runtime Directory /run/user/0.
Dec 11 00:49:33 np0005554845 systemd[1]: Removed slice User Slice of UID 0.
Dec 11 00:49:34 np0005554845 python3.9[96186]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:49:35 np0005554845 python3.9[96342]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:36 np0005554845 python3.9[96494]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:36 np0005554845 python3.9[96646]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/kill_scripts setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:37 np0005554845 python3.9[96798]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/ovn-metadata-proxy setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:38 np0005554845 python3.9[96950]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/neutron/external/pids setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:39 np0005554845 python3.9[97100]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'selinux'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:49:40 np0005554845 python3.9[97252]: ansible-ansible.posix.seboolean Invoked with name=virt_sandbox_use_netlink persistent=True state=True ignore_selinux_state=False
Dec 11 00:49:41 np0005554845 python3.9[97403]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/ovn_metadata_haproxy_wrapper follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:42 np0005554845 python3.9[97524]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/ovn_metadata_haproxy_wrapper mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432181.2564418-220-231083234126393/.source follow=False _original_basename=haproxy.j2 checksum=95c62e64c8f82dd9393a560d1b052dc98d38f810 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:43 np0005554845 python3.9[97674]: ansible-ansible.legacy.stat Invoked with path=/var/lib/neutron/kill_scripts/haproxy-kill follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:43 np0005554845 python3.9[97795]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/neutron/kill_scripts/haproxy-kill mode=0755 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432182.8565075-265-5328908298813/.source follow=False _original_basename=kill-script.j2 checksum=2dfb5489f491f61b95691c3bf95fa1fe48ff3700 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:45 np0005554845 python3.9[97947]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:49:46 np0005554845 python3.9[98031]: ansible-ansible.legacy.dnf Invoked with name=['openvswitch'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:49:48 np0005554845 python3.9[98184]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=openvswitch.service state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 00:49:49 np0005554845 python3.9[98337]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:50 np0005554845 python3.9[98458]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-rootwrap.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432188.9605894-376-189225811148580/.source.conf follow=False _original_basename=rootwrap.conf.j2 checksum=11f2cfb4b7d97b2cef3c2c2d88089e6999cffe22 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:50 np0005554845 python3.9[98608]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:51 np0005554845 python3.9[98729]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/01-neutron-ovn-metadata-agent.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432190.206796-376-125738213272266/.source.conf follow=False _original_basename=neutron-ovn-metadata-agent.conf.j2 checksum=8bc979abbe81c2cf3993a225517a7e2483e20443 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:52 np0005554845 python3.9[98880]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:53 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:53Z|00025|memory|INFO|16128 kB peak resident set size after 30.0 seconds
Dec 11 00:49:53 np0005554845 ovn_controller[95428]: 2025-12-11T05:49:53Z|00026|memory|INFO|idl-cells-OVN_Southbound:273 idl-cells-Open_vSwitch:585 ofctrl_desired_flow_usage-KB:7 ofctrl_installed_flow_usage-KB:5 ofctrl_sb_flow_ref_usage-KB:2
Dec 11 00:49:53 np0005554845 podman[98904]: 2025-12-11 05:49:53.148552611 +0000 UTC m=+0.081140227 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 11 00:49:53 np0005554845 python3.9[99027]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/10-neutron-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432192.2157135-508-74286031226604/.source.conf _original_basename=10-neutron-metadata.conf follow=False checksum=ca7d4d155f5b812fab1a3b70e34adb495d291b8d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:54 np0005554845 python3.9[99177]: ansible-ansible.legacy.stat Invoked with path=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:54 np0005554845 python3.9[99298]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent/05-nova-metadata.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432193.6320238-508-143813876729905/.source.conf _original_basename=05-nova-metadata.conf follow=False checksum=a14d6b38898a379cd37fc0bf365d17f10859446f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:55 np0005554845 python3.9[99448]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:49:56 np0005554845 python3.9[99602]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:57 np0005554845 python3.9[99754]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:57 np0005554845 python3.9[99832]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:58 np0005554845 python3.9[99984]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:49:58 np0005554845 python3.9[100062]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:49:59 np0005554845 python3.9[100214]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:00 np0005554845 python3.9[100366]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:50:01 np0005554845 python3.9[100444]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:01 np0005554845 python3.9[100596]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:50:02 np0005554845 python3.9[100674]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:03 np0005554845 python3.9[100826]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:50:03 np0005554845 systemd[1]: Reloading.
Dec 11 00:50:03 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:50:03 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:50:04 np0005554845 python3.9[101017]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:50:04 np0005554845 python3.9[101095]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:05 np0005554845 python3.9[101247]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:50:06 np0005554845 python3.9[101325]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:07 np0005554845 python3.9[101477]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:50:07 np0005554845 systemd[1]: Reloading.
Dec 11 00:50:07 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:50:07 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:50:07 np0005554845 systemd[1]: Starting Create netns directory...
Dec 11 00:50:07 np0005554845 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 11 00:50:07 np0005554845 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 11 00:50:07 np0005554845 systemd[1]: Finished Create netns directory.
Dec 11 00:50:08 np0005554845 python3.9[101670]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:50:09 np0005554845 python3.9[101822]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ovn_metadata_agent/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:50:09 np0005554845 python3.9[101945]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ovn_metadata_agent/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432208.8255892-961-78515348701828/.source _original_basename=healthcheck follow=False checksum=898a5a1fcd473cf731177fc866e3bd7ebf20a131 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:50:11 np0005554845 python3.9[102097]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:50:11 np0005554845 python3.9[102249]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/ovn_metadata_agent.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:50:12 np0005554845 python3.9[102372]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/ovn_metadata_agent.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432211.4224184-1036-91933236768417/.source.json _original_basename=.3sx_ifd5 follow=False checksum=a908ef151ded3a33ae6c9ac8be72a35e5e33b9dc backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:13 np0005554845 python3.9[102524]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:15 np0005554845 python3.9[102951]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_pattern=*.json debug=False
Dec 11 00:50:16 np0005554845 python3.9[103103]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 00:50:17 np0005554845 python3.9[103255]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 11 00:50:19 np0005554845 python3[103433]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/ovn_metadata_agent config_id=ovn_metadata_agent config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 00:50:19 np0005554845 podman[103472]: 2025-12-11 05:50:19.639286739 +0000 UTC m=+0.062657085 container create 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ovn_metadata_agent)
Dec 11 00:50:19 np0005554845 podman[103472]: 2025-12-11 05:50:19.604763481 +0000 UTC m=+0.028133917 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 00:50:19 np0005554845 python3[103433]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ovn_metadata_agent --cgroupns=host --conmon-pidfile /run/ovn_metadata_agent.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env EDPM_CONFIG_HASH=0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d --healthcheck-command /openstack/healthcheck --label config_id=ovn_metadata_agent --label container_name=ovn_metadata_agent --label managed_by=edpm_ansible --label config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']} --log-driver journald --log-level info --network host --pid host --privileged=True --user root --volume /run/openvswitch:/run/openvswitch:z --volume /var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z --volume /run/netns:/run/netns:shared --volume /var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro --volume /var/lib/neutron:/var/lib/neutron:shared,z --volume /var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro --volume /var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro --volume /var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z --volume /var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z --volume /var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 00:50:20 np0005554845 python3.9[103662]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:50:21 np0005554845 python3.9[103816]: ansible-file Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:21 np0005554845 python3.9[103892]: ansible-stat Invoked with path=/etc/systemd/system/edpm_ovn_metadata_agent_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:50:22 np0005554845 python3.9[104044]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765432221.822706-1300-266968731224846/source dest=/etc/systemd/system/edpm_ovn_metadata_agent.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:23 np0005554845 python3.9[104120]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:50:23 np0005554845 systemd[1]: Reloading.
Dec 11 00:50:23 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:50:23 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:50:23 np0005554845 podman[104157]: 2025-12-11 05:50:23.369401682 +0000 UTC m=+0.102591048 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller)
Dec 11 00:50:23 np0005554845 python3.9[104259]: ansible-systemd Invoked with state=restarted name=edpm_ovn_metadata_agent.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:50:23 np0005554845 systemd[1]: Reloading.
Dec 11 00:50:23 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:50:23 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:50:24 np0005554845 systemd[1]: Starting ovn_metadata_agent container...
Dec 11 00:50:24 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:50:24 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01388fea5fe8fb3a30266e3a9b2342f4ec3d94ff72da9018c9b812047b9c0540/merged/etc/neutron.conf.d supports timestamps until 2038 (0x7fffffff)
Dec 11 00:50:24 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/01388fea5fe8fb3a30266e3a9b2342f4ec3d94ff72da9018c9b812047b9c0540/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 00:50:24 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0.
Dec 11 00:50:24 np0005554845 podman[104300]: 2025-12-11 05:50:24.297834985 +0000 UTC m=+0.127572699 container init 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: + sudo -E kolla_set_configs
Dec 11 00:50:24 np0005554845 podman[104300]: 2025-12-11 05:50:24.322005355 +0000 UTC m=+0.151743079 container start 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 00:50:24 np0005554845 edpm-start-podman-container[104300]: ovn_metadata_agent
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Validating config file
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Copying service configuration files
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Deleting /etc/neutron/rootwrap.conf
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Copying /etc/neutron.conf.d/01-rootwrap.conf to /etc/neutron/rootwrap.conf
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Setting permission for /etc/neutron/rootwrap.conf
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Writing out command to execute
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Setting permission for /var/lib/neutron
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Setting permission for /var/lib/neutron/ovn-metadata-proxy
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Setting permission for /var/lib/neutron/external
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Setting permission for /var/lib/neutron/ovn_metadata_haproxy_wrapper
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Setting permission for /var/lib/neutron/kill_scripts/haproxy-kill
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: INFO:__main__:Setting permission for /var/lib/neutron/external/pids
Dec 11 00:50:24 np0005554845 podman[104322]: 2025-12-11 05:50:24.39026687 +0000 UTC m=+0.056749516 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: ++ cat /run_command
Dec 11 00:50:24 np0005554845 edpm-start-podman-container[104299]: Creating additional drop-in dependency for "ovn_metadata_agent" (63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0)
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: + CMD=neutron-ovn-metadata-agent
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: + ARGS=
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: + sudo kolla_copy_cacerts
Dec 11 00:50:24 np0005554845 systemd[1]: Reloading.
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: + [[ ! -n '' ]]
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: + . kolla_extend_start
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: + echo 'Running command: '\''neutron-ovn-metadata-agent'\'''
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: Running command: 'neutron-ovn-metadata-agent'
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: + umask 0022
Dec 11 00:50:24 np0005554845 ovn_metadata_agent[104315]: + exec neutron-ovn-metadata-agent
Dec 11 00:50:24 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:50:24 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:50:24 np0005554845 systemd[1]: Started ovn_metadata_agent container.
Dec 11 00:50:25 np0005554845 systemd-logind[789]: Session 23 logged out. Waiting for processes to exit.
Dec 11 00:50:25 np0005554845 systemd[1]: session-23.scope: Deactivated successfully.
Dec 11 00:50:25 np0005554845 systemd[1]: session-23.scope: Consumed 34.902s CPU time.
Dec 11 00:50:25 np0005554845 systemd-logind[789]: Removed session 23.
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.165 104320 INFO neutron.common.config [-] Logging enabled!#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.166 104320 INFO neutron.common.config [-] /usr/bin/neutron-ovn-metadata-agent version 22.2.2.dev43#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.166 104320 DEBUG neutron.common.config [-] command line: /usr/bin/neutron-ovn-metadata-agent setup_logging /usr/lib/python3.9/site-packages/neutron/common/config.py:123#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.166 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.166 104320 DEBUG neutron.agent.ovn.metadata_agent [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.166 104320 DEBUG neutron.agent.ovn.metadata_agent [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.167 104320 DEBUG neutron.agent.ovn.metadata_agent [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.167 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.167 104320 DEBUG neutron.agent.ovn.metadata_agent [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.167 104320 DEBUG neutron.agent.ovn.metadata_agent [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.167 104320 DEBUG neutron.agent.ovn.metadata_agent [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.167 104320 DEBUG neutron.agent.ovn.metadata_agent [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.167 104320 DEBUG neutron.agent.ovn.metadata_agent [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.167 104320 DEBUG neutron.agent.ovn.metadata_agent [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.168 104320 DEBUG neutron.agent.ovn.metadata_agent [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.168 104320 DEBUG neutron.agent.ovn.metadata_agent [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.168 104320 DEBUG neutron.agent.ovn.metadata_agent [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.168 104320 DEBUG neutron.agent.ovn.metadata_agent [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.168 104320 DEBUG neutron.agent.ovn.metadata_agent [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.168 104320 DEBUG neutron.agent.ovn.metadata_agent [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.168 104320 DEBUG neutron.agent.ovn.metadata_agent [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.168 104320 DEBUG neutron.agent.ovn.metadata_agent [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.168 104320 DEBUG neutron.agent.ovn.metadata_agent [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.168 104320 DEBUG neutron.agent.ovn.metadata_agent [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.169 104320 DEBUG neutron.agent.ovn.metadata_agent [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.169 104320 DEBUG neutron.agent.ovn.metadata_agent [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.169 104320 DEBUG neutron.agent.ovn.metadata_agent [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.169 104320 DEBUG neutron.agent.ovn.metadata_agent [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.169 104320 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.169 104320 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.169 104320 DEBUG neutron.agent.ovn.metadata_agent [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.169 104320 DEBUG neutron.agent.ovn.metadata_agent [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.170 104320 DEBUG neutron.agent.ovn.metadata_agent [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.170 104320 DEBUG neutron.agent.ovn.metadata_agent [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.170 104320 DEBUG neutron.agent.ovn.metadata_agent [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.170 104320 DEBUG neutron.agent.ovn.metadata_agent [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.170 104320 DEBUG neutron.agent.ovn.metadata_agent [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.170 104320 DEBUG neutron.agent.ovn.metadata_agent [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.170 104320 DEBUG neutron.agent.ovn.metadata_agent [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.170 104320 DEBUG neutron.agent.ovn.metadata_agent [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.171 104320 DEBUG neutron.agent.ovn.metadata_agent [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.171 104320 DEBUG neutron.agent.ovn.metadata_agent [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.171 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.171 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.171 104320 DEBUG neutron.agent.ovn.metadata_agent [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.171 104320 DEBUG neutron.agent.ovn.metadata_agent [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.171 104320 DEBUG neutron.agent.ovn.metadata_agent [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.171 104320 DEBUG neutron.agent.ovn.metadata_agent [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.171 104320 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.171 104320 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.172 104320 DEBUG neutron.agent.ovn.metadata_agent [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.172 104320 DEBUG neutron.agent.ovn.metadata_agent [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.172 104320 DEBUG neutron.agent.ovn.metadata_agent [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.172 104320 DEBUG neutron.agent.ovn.metadata_agent [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.172 104320 DEBUG neutron.agent.ovn.metadata_agent [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.172 104320 DEBUG neutron.agent.ovn.metadata_agent [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.172 104320 DEBUG neutron.agent.ovn.metadata_agent [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.172 104320 DEBUG neutron.agent.ovn.metadata_agent [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.172 104320 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.172 104320 DEBUG neutron.agent.ovn.metadata_agent [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.173 104320 DEBUG neutron.agent.ovn.metadata_agent [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.173 104320 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.173 104320 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.173 104320 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.173 104320 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.173 104320 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.173 104320 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.173 104320 DEBUG neutron.agent.ovn.metadata_agent [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.173 104320 DEBUG neutron.agent.ovn.metadata_agent [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.173 104320 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.174 104320 DEBUG neutron.agent.ovn.metadata_agent [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.174 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.174 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.174 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.174 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.174 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.174 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.174 104320 DEBUG neutron.agent.ovn.metadata_agent [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.174 104320 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.175 104320 DEBUG neutron.agent.ovn.metadata_agent [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.175 104320 DEBUG neutron.agent.ovn.metadata_agent [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.175 104320 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.175 104320 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.175 104320 DEBUG neutron.agent.ovn.metadata_agent [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.175 104320 DEBUG neutron.agent.ovn.metadata_agent [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.175 104320 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.175 104320 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.175 104320 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.176 104320 DEBUG neutron.agent.ovn.metadata_agent [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.176 104320 DEBUG neutron.agent.ovn.metadata_agent [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.176 104320 DEBUG neutron.agent.ovn.metadata_agent [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.176 104320 DEBUG neutron.agent.ovn.metadata_agent [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.176 104320 DEBUG neutron.agent.ovn.metadata_agent [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.176 104320 DEBUG neutron.agent.ovn.metadata_agent [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.176 104320 DEBUG neutron.agent.ovn.metadata_agent [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.176 104320 DEBUG neutron.agent.ovn.metadata_agent [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.177 104320 DEBUG neutron.agent.ovn.metadata_agent [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.177 104320 DEBUG neutron.agent.ovn.metadata_agent [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.177 104320 DEBUG neutron.agent.ovn.metadata_agent [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.177 104320 DEBUG neutron.agent.ovn.metadata_agent [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.177 104320 DEBUG neutron.agent.ovn.metadata_agent [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.177 104320 DEBUG neutron.agent.ovn.metadata_agent [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.177 104320 DEBUG neutron.agent.ovn.metadata_agent [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.177 104320 DEBUG neutron.agent.ovn.metadata_agent [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.177 104320 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.177 104320 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.177 104320 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.178 104320 DEBUG neutron.agent.ovn.metadata_agent [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.178 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.178 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.178 104320 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.178 104320 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.178 104320 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.178 104320 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.178 104320 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.178 104320 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.179 104320 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.179 104320 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.179 104320 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.179 104320 DEBUG neutron.agent.ovn.metadata_agent [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.179 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.179 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.179 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.179 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.179 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.180 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.180 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.180 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.180 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.180 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.180 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.180 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.180 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.180 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.180 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.181 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.181 104320 DEBUG neutron.agent.ovn.metadata_agent [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.181 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.181 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.181 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.181 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.181 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.181 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.181 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.181 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.182 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.182 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.182 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.182 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.182 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.182 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.182 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.182 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.182 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.182 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.183 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.183 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.183 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.183 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.183 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.183 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.183 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.183 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.183 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.183 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.184 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.184 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.184 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.184 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.184 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.184 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.184 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.184 104320 DEBUG neutron.agent.ovn.metadata_agent [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.184 104320 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.184 104320 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.185 104320 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.185 104320 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.185 104320 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.185 104320 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.185 104320 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.185 104320 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.185 104320 DEBUG neutron.agent.ovn.metadata_agent [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.185 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.185 104320 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.185 104320 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.186 104320 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.186 104320 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.186 104320 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.186 104320 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.186 104320 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.186 104320 DEBUG neutron.agent.ovn.metadata_agent [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.186 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.186 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.186 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.186 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.187 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.187 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.187 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.187 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.187 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.187 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.187 104320 DEBUG neutron.agent.ovn.metadata_agent [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.187 104320 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.187 104320 DEBUG neutron.agent.ovn.metadata_agent [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.187 104320 DEBUG neutron.agent.ovn.metadata_agent [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.188 104320 DEBUG neutron.agent.ovn.metadata_agent [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.188 104320 DEBUG neutron.agent.ovn.metadata_agent [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.188 104320 DEBUG neutron.agent.ovn.metadata_agent [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.188 104320 DEBUG neutron.agent.ovn.metadata_agent [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.188 104320 DEBUG neutron.agent.ovn.metadata_agent [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.188 104320 DEBUG neutron.agent.ovn.metadata_agent [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.188 104320 DEBUG neutron.agent.ovn.metadata_agent [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.188 104320 DEBUG neutron.agent.ovn.metadata_agent [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.188 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.188 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.189 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.189 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.189 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.189 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.189 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.189 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.189 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.189 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.189 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.189 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.190 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.190 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.190 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.190 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.190 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.190 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.190 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.190 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.190 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.190 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.191 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.191 104320 DEBUG neutron.agent.ovn.metadata_agent [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.191 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.191 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.191 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.191 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.191 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.191 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.191 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.191 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.192 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.192 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.192 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.192 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.192 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.192 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.192 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.192 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.192 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.193 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.193 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.193 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.193 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.193 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.193 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.193 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.193 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.193 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.193 104320 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.194 104320 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.194 104320 DEBUG neutron.agent.ovn.metadata_agent [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.194 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.194 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.194 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.194 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.194 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.194 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.194 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.194 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.195 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.195 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.195 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.195 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.195 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.195 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.195 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.195 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.196 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.196 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.196 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.196 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.196 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.196 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.196 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.196 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.196 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.196 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.197 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.197 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.197 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.197 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.197 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.197 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.197 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.197 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.197 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.197 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.198 104320 DEBUG neutron.agent.ovn.metadata_agent [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.198 104320 DEBUG neutron.agent.ovn.metadata_agent [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.206 104320 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.206 104320 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.206 104320 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.206 104320 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connecting...#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.207 104320 INFO ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: connected#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.218 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Loaded chassis name 3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459 (UUID: 3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459) and ovn bridge br-int. _load_config /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:309#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.242 104320 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.243 104320 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.243 104320 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.243 104320 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Chassis_Private.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.247 104320 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.252 104320 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.257 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched CREATE: ChassisPrivateCreateEvent(events=('create',), table='Chassis_Private', conditions=(('name', '=', '3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459'),), old_conditions=None), priority=20 to row=Chassis_Private(chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], external_ids={}, name=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, nb_cfg_timestamp=1765432171134, nb_cfg=1) old= matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.258 104320 DEBUG neutron_lib.callbacks.manager [-] Subscribe: <bound method MetadataProxyHandler.post_fork_initialize of <neutron.agent.ovn.metadata.server.MetadataProxyHandler object at 0x7f1b902e0b80>> process after_init 55550000, False subscribe /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:52#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.258 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.258 104320 DEBUG oslo_concurrency.lockutils [-] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.259 104320 DEBUG oslo_concurrency.lockutils [-] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.259 104320 INFO oslo_service.service [-] Starting 1 workers#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.263 104320 DEBUG oslo_service.service [-] Started child 104428 _start_child /usr/lib/python3.9/site-packages/oslo_service/service.py:575#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.265 104320 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.namespace_cmd', '--privsep_sock_path', '/tmp/tmp9kwylrtu/privsep.sock']#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.267 104428 DEBUG neutron_lib.callbacks.manager [-] Publish callbacks ['neutron.agent.ovn.metadata.server.MetadataProxyHandler.post_fork_initialize-503440'] for process (None), after_init _notify_loop /usr/lib/python3.9/site-packages/neutron_lib/callbacks/manager.py:184#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.294 104428 INFO neutron.agent.ovn.metadata.ovsdb [-] Getting OvsdbSbOvnIdl for MetadataAgent with retry#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.294 104428 DEBUG ovsdbapp.backend.ovs_idl [-] Created lookup_table index Chassis.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:87#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.295 104428 DEBUG ovsdbapp.backend.ovs_idl [-] Created schema index Datapath_Binding.tunnel_key autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.298 104428 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connecting...#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.304 104428 INFO ovsdbapp.backend.ovs_idl.vlog [-] ssl:ovsdbserver-sb.openstack.svc:6642: connected#033[00m
Dec 11 00:50:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.309 104428 INFO eventlet.wsgi.server [-] (104428) wsgi starting up on http:/var/lib/neutron/metadata_proxy#033[00m
Dec 11 00:50:26 np0005554845 kernel: capability: warning: `privsep-helper' uses deprecated v2 capabilities in a way that may be insecure
Dec 11 00:50:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:27.006 104320 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec 11 00:50:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:27.007 104320 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmp9kwylrtu/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec 11 00:50:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.802 104433 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec 11 00:50:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.807 104433 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec 11 00:50:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.809 104433 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_SYS_ADMIN/CAP_SYS_ADMIN/none#033[00m
Dec 11 00:50:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:26.809 104433 INFO oslo.privsep.daemon [-] privsep daemon running as pid 104433#033[00m
Dec 11 00:50:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:27.010 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[aad48c90-4af0-489c-8d3b-abb6eb234bec]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 00:50:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:27.494 104433 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:50:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:27.494 104433 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:50:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:27.494 104433 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.006 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[b754b527-6eab-4325-80a6-36d72d05d411]: (4, []) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.008 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbAddCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, column=external_ids, values=({'neutron:ovn-metadata-id': '46401acc-b9af-5f59-a0c9-4877598dfa1f'},)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.038 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-bridge': 'br-int'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.046 104320 DEBUG oslo_service.service [-] Full set of CONF: wait /usr/lib/python3.9/site-packages/oslo_service/service.py:649#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.046 104320 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.046 104320 DEBUG oslo_service.service [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.046 104320 DEBUG oslo_service.service [-] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.046 104320 DEBUG oslo_service.service [-] config files: ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.046 104320 DEBUG oslo_service.service [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.047 104320 DEBUG oslo_service.service [-] agent_down_time                = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.047 104320 DEBUG oslo_service.service [-] allow_bulk                     = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.047 104320 DEBUG oslo_service.service [-] api_extensions_path            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.047 104320 DEBUG oslo_service.service [-] api_paste_config               = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.047 104320 DEBUG oslo_service.service [-] api_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.047 104320 DEBUG oslo_service.service [-] auth_ca_cert                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.048 104320 DEBUG oslo_service.service [-] auth_strategy                  = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.048 104320 DEBUG oslo_service.service [-] backlog                        = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.048 104320 DEBUG oslo_service.service [-] base_mac                       = fa:16:3e:00:00:00 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.048 104320 DEBUG oslo_service.service [-] bind_host                      = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.048 104320 DEBUG oslo_service.service [-] bind_port                      = 9696 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.048 104320 DEBUG oslo_service.service [-] client_socket_timeout          = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.048 104320 DEBUG oslo_service.service [-] config_dir                     = ['/etc/neutron.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.049 104320 DEBUG oslo_service.service [-] config_file                    = ['/etc/neutron/neutron.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.049 104320 DEBUG oslo_service.service [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.049 104320 DEBUG oslo_service.service [-] control_exchange               = neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.049 104320 DEBUG oslo_service.service [-] core_plugin                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.049 104320 DEBUG oslo_service.service [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.049 104320 DEBUG oslo_service.service [-] default_availability_zones     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.050 104320 DEBUG oslo_service.service [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'OFPHandler=INFO', 'OfctlService=INFO', 'os_ken.base.app_manager=INFO', 'os_ken.controller.controller=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.050 104320 DEBUG oslo_service.service [-] dhcp_agent_notification        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.050 104320 DEBUG oslo_service.service [-] dhcp_lease_duration            = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.050 104320 DEBUG oslo_service.service [-] dhcp_load_type                 = networks log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.050 104320 DEBUG oslo_service.service [-] dns_domain                     = openstacklocal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.050 104320 DEBUG oslo_service.service [-] enable_new_agents              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.050 104320 DEBUG oslo_service.service [-] enable_traditional_dhcp        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.051 104320 DEBUG oslo_service.service [-] external_dns_driver            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.051 104320 DEBUG oslo_service.service [-] external_pids                  = /var/lib/neutron/external/pids log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.051 104320 DEBUG oslo_service.service [-] filter_validation              = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.051 104320 DEBUG oslo_service.service [-] global_physnet_mtu             = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.051 104320 DEBUG oslo_service.service [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.051 104320 DEBUG oslo_service.service [-] host                           = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.052 104320 DEBUG oslo_service.service [-] http_retries                   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.052 104320 DEBUG oslo_service.service [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.052 104320 DEBUG oslo_service.service [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.052 104320 DEBUG oslo_service.service [-] ipam_driver                    = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.052 104320 DEBUG oslo_service.service [-] ipv6_pd_enabled                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.052 104320 DEBUG oslo_service.service [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.052 104320 DEBUG oslo_service.service [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.053 104320 DEBUG oslo_service.service [-] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.053 104320 DEBUG oslo_service.service [-] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.053 104320 DEBUG oslo_service.service [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.053 104320 DEBUG oslo_service.service [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.053 104320 DEBUG oslo_service.service [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.053 104320 DEBUG oslo_service.service [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.053 104320 DEBUG oslo_service.service [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.053 104320 DEBUG oslo_service.service [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.054 104320 DEBUG oslo_service.service [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.054 104320 DEBUG oslo_service.service [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.054 104320 DEBUG oslo_service.service [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.054 104320 DEBUG oslo_service.service [-] max_dns_nameservers            = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.054 104320 DEBUG oslo_service.service [-] max_header_line                = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.054 104320 DEBUG oslo_service.service [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.054 104320 DEBUG oslo_service.service [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.054 104320 DEBUG oslo_service.service [-] max_subnet_host_routes         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.055 104320 DEBUG oslo_service.service [-] metadata_backlog               = 4096 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.055 104320 DEBUG oslo_service.service [-] metadata_proxy_group           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.055 104320 DEBUG oslo_service.service [-] metadata_proxy_shared_secret   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.055 104320 DEBUG oslo_service.service [-] metadata_proxy_socket          = /var/lib/neutron/metadata_proxy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.055 104320 DEBUG oslo_service.service [-] metadata_proxy_socket_mode     = deduce log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.055 104320 DEBUG oslo_service.service [-] metadata_proxy_user            =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.055 104320 DEBUG oslo_service.service [-] metadata_workers               = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.056 104320 DEBUG oslo_service.service [-] network_link_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.056 104320 DEBUG oslo_service.service [-] notify_nova_on_port_data_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.056 104320 DEBUG oslo_service.service [-] notify_nova_on_port_status_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.056 104320 DEBUG oslo_service.service [-] nova_client_cert               =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.056 104320 DEBUG oslo_service.service [-] nova_client_priv_key           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.056 104320 DEBUG oslo_service.service [-] nova_metadata_host             = nova-metadata-internal.openstack.svc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.056 104320 DEBUG oslo_service.service [-] nova_metadata_insecure         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.057 104320 DEBUG oslo_service.service [-] nova_metadata_port             = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.057 104320 DEBUG oslo_service.service [-] nova_metadata_protocol         = https log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.057 104320 DEBUG oslo_service.service [-] pagination_max_limit           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.057 104320 DEBUG oslo_service.service [-] periodic_fuzzy_delay           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.057 104320 DEBUG oslo_service.service [-] periodic_interval              = 40 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.057 104320 DEBUG oslo_service.service [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.057 104320 DEBUG oslo_service.service [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.057 104320 DEBUG oslo_service.service [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.058 104320 DEBUG oslo_service.service [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.058 104320 DEBUG oslo_service.service [-] retry_until_window             = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.058 104320 DEBUG oslo_service.service [-] rpc_resources_processing_step  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.058 104320 DEBUG oslo_service.service [-] rpc_response_max_timeout       = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.058 104320 DEBUG oslo_service.service [-] rpc_state_report_workers       = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.058 104320 DEBUG oslo_service.service [-] rpc_workers                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.058 104320 DEBUG oslo_service.service [-] send_events_interval           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.059 104320 DEBUG oslo_service.service [-] service_plugins                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.059 104320 DEBUG oslo_service.service [-] setproctitle                   = on log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.059 104320 DEBUG oslo_service.service [-] state_path                     = /var/lib/neutron log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.059 104320 DEBUG oslo_service.service [-] syslog_log_facility            = syslog log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.059 104320 DEBUG oslo_service.service [-] tcp_keepidle                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.059 104320 DEBUG oslo_service.service [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.059 104320 DEBUG oslo_service.service [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.059 104320 DEBUG oslo_service.service [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.060 104320 DEBUG oslo_service.service [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.060 104320 DEBUG oslo_service.service [-] use_ssl                        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.060 104320 DEBUG oslo_service.service [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.060 104320 DEBUG oslo_service.service [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.060 104320 DEBUG oslo_service.service [-] vlan_transparent               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.060 104320 DEBUG oslo_service.service [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.060 104320 DEBUG oslo_service.service [-] wsgi_default_pool_size         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.060 104320 DEBUG oslo_service.service [-] wsgi_keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.061 104320 DEBUG oslo_service.service [-] wsgi_log_format                = %(client_ip)s "%(request_line)s" status: %(status_code)s  len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.061 104320 DEBUG oslo_service.service [-] wsgi_server_debug              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.061 104320 DEBUG oslo_service.service [-] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.061 104320 DEBUG oslo_service.service [-] oslo_concurrency.lock_path     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.061 104320 DEBUG oslo_service.service [-] profiler.connection_string     = messaging:// log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.061 104320 DEBUG oslo_service.service [-] profiler.enabled               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.062 104320 DEBUG oslo_service.service [-] profiler.es_doc_type           = notification log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.062 104320 DEBUG oslo_service.service [-] profiler.es_scroll_size        = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.062 104320 DEBUG oslo_service.service [-] profiler.es_scroll_time        = 2m log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.062 104320 DEBUG oslo_service.service [-] profiler.filter_error_trace    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.062 104320 DEBUG oslo_service.service [-] profiler.hmac_keys             = SECRET_KEY log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.062 104320 DEBUG oslo_service.service [-] profiler.sentinel_service_name = mymaster log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.062 104320 DEBUG oslo_service.service [-] profiler.socket_timeout        = 0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.063 104320 DEBUG oslo_service.service [-] profiler.trace_sqlalchemy      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.063 104320 DEBUG oslo_service.service [-] oslo_policy.enforce_new_defaults = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.063 104320 DEBUG oslo_service.service [-] oslo_policy.enforce_scope      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.063 104320 DEBUG oslo_service.service [-] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.063 104320 DEBUG oslo_service.service [-] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.063 104320 DEBUG oslo_service.service [-] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.063 104320 DEBUG oslo_service.service [-] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.064 104320 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.064 104320 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.064 104320 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.064 104320 DEBUG oslo_service.service [-] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.064 104320 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.064 104320 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.064 104320 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.065 104320 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.065 104320 DEBUG oslo_service.service [-] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.065 104320 DEBUG oslo_service.service [-] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.065 104320 DEBUG oslo_service.service [-] service_providers.service_provider = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.065 104320 DEBUG oslo_service.service [-] privsep.capabilities           = [21, 12, 1, 2, 19] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.065 104320 DEBUG oslo_service.service [-] privsep.group                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.065 104320 DEBUG oslo_service.service [-] privsep.helper_command         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.066 104320 DEBUG oslo_service.service [-] privsep.logger_name            = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.066 104320 DEBUG oslo_service.service [-] privsep.thread_pool_size       = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.066 104320 DEBUG oslo_service.service [-] privsep.user                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.066 104320 DEBUG oslo_service.service [-] privsep_dhcp_release.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.066 104320 DEBUG oslo_service.service [-] privsep_dhcp_release.group     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.066 104320 DEBUG oslo_service.service [-] privsep_dhcp_release.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.066 104320 DEBUG oslo_service.service [-] privsep_dhcp_release.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.066 104320 DEBUG oslo_service.service [-] privsep_dhcp_release.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.067 104320 DEBUG oslo_service.service [-] privsep_dhcp_release.user      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.067 104320 DEBUG oslo_service.service [-] privsep_ovs_vsctl.capabilities = [21, 12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.067 104320 DEBUG oslo_service.service [-] privsep_ovs_vsctl.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.067 104320 DEBUG oslo_service.service [-] privsep_ovs_vsctl.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.067 104320 DEBUG oslo_service.service [-] privsep_ovs_vsctl.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.067 104320 DEBUG oslo_service.service [-] privsep_ovs_vsctl.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.067 104320 DEBUG oslo_service.service [-] privsep_ovs_vsctl.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.068 104320 DEBUG oslo_service.service [-] privsep_namespace.capabilities = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.068 104320 DEBUG oslo_service.service [-] privsep_namespace.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.068 104320 DEBUG oslo_service.service [-] privsep_namespace.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.068 104320 DEBUG oslo_service.service [-] privsep_namespace.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.068 104320 DEBUG oslo_service.service [-] privsep_namespace.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.068 104320 DEBUG oslo_service.service [-] privsep_namespace.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.068 104320 DEBUG oslo_service.service [-] privsep_conntrack.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.068 104320 DEBUG oslo_service.service [-] privsep_conntrack.group        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.068 104320 DEBUG oslo_service.service [-] privsep_conntrack.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.069 104320 DEBUG oslo_service.service [-] privsep_conntrack.logger_name  = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.069 104320 DEBUG oslo_service.service [-] privsep_conntrack.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.069 104320 DEBUG oslo_service.service [-] privsep_conntrack.user         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.069 104320 DEBUG oslo_service.service [-] privsep_link.capabilities      = [12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.069 104320 DEBUG oslo_service.service [-] privsep_link.group             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.069 104320 DEBUG oslo_service.service [-] privsep_link.helper_command    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.069 104320 DEBUG oslo_service.service [-] privsep_link.logger_name       = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.070 104320 DEBUG oslo_service.service [-] privsep_link.thread_pool_size  = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.070 104320 DEBUG oslo_service.service [-] privsep_link.user              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.070 104320 DEBUG oslo_service.service [-] AGENT.check_child_processes_action = respawn log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.070 104320 DEBUG oslo_service.service [-] AGENT.check_child_processes_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.070 104320 DEBUG oslo_service.service [-] AGENT.comment_iptables_rules   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.070 104320 DEBUG oslo_service.service [-] AGENT.debug_iptables_rules     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.070 104320 DEBUG oslo_service.service [-] AGENT.kill_scripts_path        = /etc/neutron/kill_scripts/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.070 104320 DEBUG oslo_service.service [-] AGENT.root_helper              = sudo neutron-rootwrap /etc/neutron/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.071 104320 DEBUG oslo_service.service [-] AGENT.root_helper_daemon       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.071 104320 DEBUG oslo_service.service [-] AGENT.use_helper_for_ns_read   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.071 104320 DEBUG oslo_service.service [-] AGENT.use_random_fully         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.071 104320 DEBUG oslo_service.service [-] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.071 104320 DEBUG oslo_service.service [-] QUOTAS.default_quota           = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.071 104320 DEBUG oslo_service.service [-] QUOTAS.quota_driver            = neutron.db.quota.driver_nolock.DbQuotaNoLockDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.071 104320 DEBUG oslo_service.service [-] QUOTAS.quota_network           = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.072 104320 DEBUG oslo_service.service [-] QUOTAS.quota_port              = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.072 104320 DEBUG oslo_service.service [-] QUOTAS.quota_security_group    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.072 104320 DEBUG oslo_service.service [-] QUOTAS.quota_security_group_rule = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.072 104320 DEBUG oslo_service.service [-] QUOTAS.quota_subnet            = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.072 104320 DEBUG oslo_service.service [-] QUOTAS.track_quota_usage       = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.072 104320 DEBUG oslo_service.service [-] nova.auth_section              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.072 104320 DEBUG oslo_service.service [-] nova.auth_type                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.073 104320 DEBUG oslo_service.service [-] nova.cafile                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.073 104320 DEBUG oslo_service.service [-] nova.certfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.073 104320 DEBUG oslo_service.service [-] nova.collect_timing            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.073 104320 DEBUG oslo_service.service [-] nova.endpoint_type             = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.073 104320 DEBUG oslo_service.service [-] nova.insecure                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.073 104320 DEBUG oslo_service.service [-] nova.keyfile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.073 104320 DEBUG oslo_service.service [-] nova.region_name               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.074 104320 DEBUG oslo_service.service [-] nova.split_loggers             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.074 104320 DEBUG oslo_service.service [-] nova.timeout                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.074 104320 DEBUG oslo_service.service [-] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.074 104320 DEBUG oslo_service.service [-] placement.auth_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.074 104320 DEBUG oslo_service.service [-] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.074 104320 DEBUG oslo_service.service [-] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.074 104320 DEBUG oslo_service.service [-] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.074 104320 DEBUG oslo_service.service [-] placement.endpoint_type        = public log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.075 104320 DEBUG oslo_service.service [-] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.075 104320 DEBUG oslo_service.service [-] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.075 104320 DEBUG oslo_service.service [-] placement.region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.075 104320 DEBUG oslo_service.service [-] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.075 104320 DEBUG oslo_service.service [-] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.075 104320 DEBUG oslo_service.service [-] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.075 104320 DEBUG oslo_service.service [-] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.076 104320 DEBUG oslo_service.service [-] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.076 104320 DEBUG oslo_service.service [-] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.076 104320 DEBUG oslo_service.service [-] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.076 104320 DEBUG oslo_service.service [-] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.076 104320 DEBUG oslo_service.service [-] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.076 104320 DEBUG oslo_service.service [-] ironic.enable_notifications    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.076 104320 DEBUG oslo_service.service [-] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.076 104320 DEBUG oslo_service.service [-] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.077 104320 DEBUG oslo_service.service [-] ironic.interface               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.077 104320 DEBUG oslo_service.service [-] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.077 104320 DEBUG oslo_service.service [-] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.077 104320 DEBUG oslo_service.service [-] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.077 104320 DEBUG oslo_service.service [-] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.077 104320 DEBUG oslo_service.service [-] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.077 104320 DEBUG oslo_service.service [-] ironic.service_type            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.077 104320 DEBUG oslo_service.service [-] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.078 104320 DEBUG oslo_service.service [-] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.078 104320 DEBUG oslo_service.service [-] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.078 104320 DEBUG oslo_service.service [-] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.078 104320 DEBUG oslo_service.service [-] ironic.valid_interfaces        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.078 104320 DEBUG oslo_service.service [-] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.078 104320 DEBUG oslo_service.service [-] cli_script.dry_run             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.078 104320 DEBUG oslo_service.service [-] ovn.allow_stateless_action_supported = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.079 104320 DEBUG oslo_service.service [-] ovn.dhcp_default_lease_time    = 43200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.079 104320 DEBUG oslo_service.service [-] ovn.disable_ovn_dhcp_for_baremetal_ports = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.079 104320 DEBUG oslo_service.service [-] ovn.dns_servers                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.079 104320 DEBUG oslo_service.service [-] ovn.enable_distributed_floating_ip = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.079 104320 DEBUG oslo_service.service [-] ovn.neutron_sync_mode          = log log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.079 104320 DEBUG oslo_service.service [-] ovn.ovn_dhcp4_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.079 104320 DEBUG oslo_service.service [-] ovn.ovn_dhcp6_global_options   = {} log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.079 104320 DEBUG oslo_service.service [-] ovn.ovn_emit_need_to_frag      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.080 104320 DEBUG oslo_service.service [-] ovn.ovn_l3_mode                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.080 104320 DEBUG oslo_service.service [-] ovn.ovn_l3_scheduler           = leastloaded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.080 104320 DEBUG oslo_service.service [-] ovn.ovn_metadata_enabled       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.080 104320 DEBUG oslo_service.service [-] ovn.ovn_nb_ca_cert             =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.080 104320 DEBUG oslo_service.service [-] ovn.ovn_nb_certificate         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.080 104320 DEBUG oslo_service.service [-] ovn.ovn_nb_connection          = tcp:127.0.0.1:6641 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.080 104320 DEBUG oslo_service.service [-] ovn.ovn_nb_private_key         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.081 104320 DEBUG oslo_service.service [-] ovn.ovn_sb_ca_cert             = /etc/pki/tls/certs/ovndbca.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.081 104320 DEBUG oslo_service.service [-] ovn.ovn_sb_certificate         = /etc/pki/tls/certs/ovndb.crt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.081 104320 DEBUG oslo_service.service [-] ovn.ovn_sb_connection          = ssl:ovsdbserver-sb.openstack.svc:6642 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.081 104320 DEBUG oslo_service.service [-] ovn.ovn_sb_private_key         = /etc/pki/tls/private/ovndb.key log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.081 104320 DEBUG oslo_service.service [-] ovn.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.081 104320 DEBUG oslo_service.service [-] ovn.ovsdb_log_level            = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.081 104320 DEBUG oslo_service.service [-] ovn.ovsdb_probe_interval       = 60000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.081 104320 DEBUG oslo_service.service [-] ovn.ovsdb_retry_max_interval   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.082 104320 DEBUG oslo_service.service [-] ovn.vhost_sock_dir             = /var/run/openvswitch log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.082 104320 DEBUG oslo_service.service [-] ovn.vif_type                   = ovs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.082 104320 DEBUG oslo_service.service [-] OVS.bridge_mac_table_size      = 50000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.082 104320 DEBUG oslo_service.service [-] OVS.igmp_snooping_enable       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.082 104320 DEBUG oslo_service.service [-] OVS.ovsdb_timeout              = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.082 104320 DEBUG oslo_service.service [-] ovs.ovsdb_connection           = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.082 104320 DEBUG oslo_service.service [-] ovs.ovsdb_connection_timeout   = 180 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.083 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.083 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.083 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.083 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.083 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.083 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.083 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.084 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.084 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.084 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.084 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.084 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.084 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.084 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.085 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.085 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.085 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.085 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.085 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.085 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.085 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.085 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.086 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.086 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.086 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.086 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.086 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.086 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.086 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.087 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.087 104320 DEBUG oslo_service.service [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.087 104320 DEBUG oslo_service.service [-] oslo_messaging_notifications.driver = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.087 104320 DEBUG oslo_service.service [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.087 104320 DEBUG oslo_service.service [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.087 104320 DEBUG oslo_service.service [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:50:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:50:28.087 104320 DEBUG oslo_service.service [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec 11 00:50:31 np0005554845 systemd-logind[789]: New session 24 of user zuul.
Dec 11 00:50:31 np0005554845 systemd[1]: Started Session 24 of User zuul.
Dec 11 00:50:32 np0005554845 python3.9[104591]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:50:33 np0005554845 python3.9[104747]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps -a --filter name=^nova_virtlogd$ --format \{\{.Names\}\} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:50:35 np0005554845 python3.9[104910]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:50:35 np0005554845 systemd[1]: Reloading.
Dec 11 00:50:35 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:50:35 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:50:36 np0005554845 python3.9[105095]: ansible-ansible.builtin.service_facts Invoked
Dec 11 00:50:36 np0005554845 network[105112]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 00:50:36 np0005554845 network[105113]: 'network-scripts' will be removed from distribution in near future.
Dec 11 00:50:36 np0005554845 network[105114]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 00:50:41 np0005554845 python3.9[105376]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_libvirt.target state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:50:41 np0005554845 python3.9[105529]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtlogd_wrapper.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:50:42 np0005554845 python3.9[105682]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtnodedevd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:50:43 np0005554845 python3.9[105835]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtproxyd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:50:44 np0005554845 python3.9[105988]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtqemud.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:50:45 np0005554845 python3.9[106141]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtsecretd.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:50:45 np0005554845 python3.9[106294]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_virtstoraged.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:50:47 np0005554845 python3.9[106447]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:47 np0005554845 python3.9[106599]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:48 np0005554845 python3.9[106751]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:49 np0005554845 python3.9[106903]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:49 np0005554845 python3.9[107055]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:50 np0005554845 python3.9[107207]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:50 np0005554845 python3.9[107359]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:52 np0005554845 python3.9[107511]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_libvirt.target state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:52 np0005554845 python3.9[107663]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtlogd_wrapper.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:53 np0005554845 python3.9[107815]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtnodedevd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:54 np0005554845 podman[107939]: 2025-12-11 05:50:54.059490077 +0000 UTC m=+0.130618312 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 11 00:50:54 np0005554845 python3.9[107977]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtproxyd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:54 np0005554845 podman[108118]: 2025-12-11 05:50:54.642664291 +0000 UTC m=+0.078926653 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 00:50:54 np0005554845 python3.9[108166]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtqemud.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:55 np0005554845 python3.9[108318]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtsecretd.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:56 np0005554845 python3.9[108470]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_virtstoraged.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:50:57 np0005554845 python3.9[108624]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:50:58 np0005554845 python3.9[108776]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 00:50:59 np0005554845 python3.9[108928]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:50:59 np0005554845 systemd[1]: Reloading.
Dec 11 00:50:59 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:50:59 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:51:00 np0005554845 python3.9[109115]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_libvirt.target _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:51:01 np0005554845 python3.9[109268]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtlogd_wrapper.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:51:01 np0005554845 python3.9[109421]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtnodedevd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:51:02 np0005554845 python3.9[109574]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtproxyd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:51:03 np0005554845 python3.9[109727]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtqemud.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:51:03 np0005554845 python3.9[109880]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtsecretd.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:51:04 np0005554845 python3.9[110033]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_virtstoraged.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:51:06 np0005554845 python3.9[110186]: ansible-ansible.builtin.getent Invoked with database=passwd key=libvirt fail_key=True service=None split=None
Dec 11 00:51:06 np0005554845 python3.9[110339]: ansible-ansible.builtin.group Invoked with gid=42473 name=libvirt state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 00:51:08 np0005554845 python3.9[110497]: ansible-ansible.builtin.user Invoked with comment=libvirt user group=libvirt groups=[''] name=libvirt shell=/sbin/nologin state=present uid=42473 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 11 00:51:09 np0005554845 python3.9[110657]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:51:10 np0005554845 python3.9[110741]: ansible-ansible.legacy.dnf Invoked with name=['libvirt ', 'libvirt-admin ', 'libvirt-client ', 'libvirt-daemon ', 'qemu-kvm', 'qemu-img', 'libguestfs', 'libseccomp', 'swtpm', 'swtpm-tools', 'edk2-ovmf', 'ceph-common', 'cyrus-sasl-scram'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:51:25 np0005554845 podman[110925]: 2025-12-11 05:51:25.177143142 +0000 UTC m=+0.103507663 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 00:51:25 np0005554845 podman[110926]: 2025-12-11 05:51:25.183286471 +0000 UTC m=+0.114304100 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 00:51:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:51:26.199 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:51:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:51:26.200 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:51:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:51:26.201 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:51:38 np0005554845 kernel: SELinux:  Converting 2759 SID table entries...
Dec 11 00:51:38 np0005554845 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 00:51:38 np0005554845 kernel: SELinux:  policy capability open_perms=1
Dec 11 00:51:38 np0005554845 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 00:51:38 np0005554845 kernel: SELinux:  policy capability always_check_network=0
Dec 11 00:51:38 np0005554845 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 00:51:38 np0005554845 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 00:51:38 np0005554845 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 00:51:48 np0005554845 kernel: SELinux:  Converting 2759 SID table entries...
Dec 11 00:51:48 np0005554845 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 00:51:48 np0005554845 kernel: SELinux:  policy capability open_perms=1
Dec 11 00:51:48 np0005554845 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 00:51:48 np0005554845 kernel: SELinux:  policy capability always_check_network=0
Dec 11 00:51:48 np0005554845 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 00:51:48 np0005554845 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 00:51:48 np0005554845 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 00:51:56 np0005554845 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=13 res=1
Dec 11 00:51:56 np0005554845 podman[110991]: 2025-12-11 05:51:56.147516444 +0000 UTC m=+0.068638622 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 11 00:51:56 np0005554845 podman[110992]: 2025-12-11 05:51:56.179731852 +0000 UTC m=+0.100845940 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 11 00:52:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:52:26.201 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:52:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:52:26.202 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:52:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:52:26.202 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:52:26 np0005554845 podman[127822]: 2025-12-11 05:52:26.300092096 +0000 UTC m=+0.056146737 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202)
Dec 11 00:52:26 np0005554845 podman[127823]: 2025-12-11 05:52:26.322544884 +0000 UTC m=+0.078617805 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 11 00:52:39 np0005554845 kernel: SELinux:  Converting 2760 SID table entries...
Dec 11 00:52:39 np0005554845 kernel: SELinux:  policy capability network_peer_controls=1
Dec 11 00:52:39 np0005554845 kernel: SELinux:  policy capability open_perms=1
Dec 11 00:52:39 np0005554845 kernel: SELinux:  policy capability extended_socket_class=1
Dec 11 00:52:39 np0005554845 kernel: SELinux:  policy capability always_check_network=0
Dec 11 00:52:39 np0005554845 kernel: SELinux:  policy capability cgroup_seclabel=1
Dec 11 00:52:39 np0005554845 kernel: SELinux:  policy capability nnp_nosuid_transition=1
Dec 11 00:52:39 np0005554845 kernel: SELinux:  policy capability genfs_seclabel_symlinks=1
Dec 11 00:52:41 np0005554845 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 11 00:52:41 np0005554845 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=14 res=1
Dec 11 00:52:41 np0005554845 dbus-broker-launch[767]: Noticed file-system modification, trigger reload.
Dec 11 00:52:49 np0005554845 systemd[1]: Stopping OpenSSH server daemon...
Dec 11 00:52:49 np0005554845 systemd[1]: sshd.service: Deactivated successfully.
Dec 11 00:52:49 np0005554845 systemd[1]: Stopped OpenSSH server daemon.
Dec 11 00:52:49 np0005554845 systemd[1]: sshd.service: Consumed 1.719s CPU time, read 564.0K from disk, written 0B to disk.
Dec 11 00:52:49 np0005554845 systemd[1]: Stopped target sshd-keygen.target.
Dec 11 00:52:49 np0005554845 systemd[1]: Stopping sshd-keygen.target...
Dec 11 00:52:49 np0005554845 systemd[1]: OpenSSH ecdsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 00:52:49 np0005554845 systemd[1]: OpenSSH ed25519 Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 00:52:49 np0005554845 systemd[1]: OpenSSH rsa Server Key Generation was skipped because of an unmet condition check (ConditionPathExists=!/run/systemd/generator.early/multi-user.target.wants/cloud-init.target).
Dec 11 00:52:49 np0005554845 systemd[1]: Reached target sshd-keygen.target.
Dec 11 00:52:49 np0005554845 systemd[1]: Starting OpenSSH server daemon...
Dec 11 00:52:49 np0005554845 systemd[1]: Started OpenSSH server daemon.
Dec 11 00:52:51 np0005554845 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 00:52:51 np0005554845 systemd[1]: Starting man-db-cache-update.service...
Dec 11 00:52:51 np0005554845 systemd[1]: Reloading.
Dec 11 00:52:51 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:52:51 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:52:51 np0005554845 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 00:52:57 np0005554845 podman[134553]: 2025-12-11 05:52:57.161870081 +0000 UTC m=+0.087420799 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Dec 11 00:52:57 np0005554845 podman[134574]: 2025-12-11 05:52:57.181093603 +0000 UTC m=+0.104380710 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 00:53:00 np0005554845 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 00:53:00 np0005554845 systemd[1]: Finished man-db-cache-update.service.
Dec 11 00:53:00 np0005554845 systemd[1]: man-db-cache-update.service: Consumed 11.070s CPU time.
Dec 11 00:53:00 np0005554845 systemd[1]: run-r18ed4c60e5964f1c8aabed804bb9c1d6.service: Deactivated successfully.
Dec 11 00:53:00 np0005554845 python3.9[137492]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 00:53:00 np0005554845 systemd[1]: Reloading.
Dec 11 00:53:00 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:53:00 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:53:01 np0005554845 python3.9[137683]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 00:53:01 np0005554845 systemd[1]: Reloading.
Dec 11 00:53:01 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:53:01 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:53:02 np0005554845 python3.9[137873]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=libvirtd-tls.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 00:53:02 np0005554845 systemd[1]: Reloading.
Dec 11 00:53:02 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:53:02 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:53:04 np0005554845 python3.9[138063]: ansible-ansible.builtin.systemd Invoked with enabled=False masked=True name=virtproxyd-tcp.socket state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 00:53:04 np0005554845 systemd[1]: Reloading.
Dec 11 00:53:04 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:53:04 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:53:05 np0005554845 python3.9[138254]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:05 np0005554845 systemd[1]: Reloading.
Dec 11 00:53:05 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:53:05 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:53:06 np0005554845 python3.9[138444]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:06 np0005554845 systemd[1]: Reloading.
Dec 11 00:53:06 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:53:06 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:53:07 np0005554845 python3.9[138635]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:07 np0005554845 systemd[1]: Reloading.
Dec 11 00:53:07 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:53:07 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:53:08 np0005554845 python3.9[138826]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:09 np0005554845 python3.9[138981]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.service daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:09 np0005554845 systemd[1]: Reloading.
Dec 11 00:53:09 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:53:09 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:53:11 np0005554845 python3.9[139171]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-tls.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None
Dec 11 00:53:11 np0005554845 systemd[1]: Reloading.
Dec 11 00:53:11 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:53:11 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:53:11 np0005554845 systemd[1]: Listening on libvirt proxy daemon socket.
Dec 11 00:53:11 np0005554845 systemd[1]: Listening on libvirt proxy daemon TLS IP socket.
Dec 11 00:53:12 np0005554845 python3.9[139364]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:13 np0005554845 python3.9[139519]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtlogd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:13 np0005554845 python3.9[139674]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:14 np0005554845 python3.9[139829]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:15 np0005554845 python3.9[139984]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtnodedevd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:16 np0005554845 python3.9[140139]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:17 np0005554845 python3.9[140294]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:18 np0005554845 python3.9[140449]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtproxyd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:18 np0005554845 python3.9[140604]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:19 np0005554845 python3.9[140759]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:20 np0005554845 python3.9[140914]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtqemud-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:21 np0005554845 python3.9[141069]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:22 np0005554845 python3.9[141224]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-ro.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:24 np0005554845 python3.9[141379]: ansible-ansible.builtin.systemd Invoked with enabled=True masked=False name=virtsecretd-admin.socket daemon_reload=False daemon_reexec=False scope=system no_block=False state=None force=None
Dec 11 00:53:25 np0005554845 python3.9[141534]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/etc/tmpfiles.d/ setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:53:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:53:26.203 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:53:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:53:26.205 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:53:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:53:26.205 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:53:26 np0005554845 python3.9[141686]: ansible-ansible.builtin.file Invoked with group=root owner=root path=/var/lib/edpm-config/firewall setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:53:27 np0005554845 python3.9[141838]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:53:27 np0005554845 podman[141962]: 2025-12-11 05:53:27.648329797 +0000 UTC m=+0.074599569 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 00:53:27 np0005554845 podman[141963]: 2025-12-11 05:53:27.675302376 +0000 UTC m=+0.103665217 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 11 00:53:27 np0005554845 python3.9[142018]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/libvirt/private setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:53:28 np0005554845 python3.9[142188]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/pki/CA setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:53:29 np0005554845 python3.9[142340]: ansible-ansible.builtin.file Invoked with group=qemu owner=root path=/etc/pki/qemu setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:53:30 np0005554845 python3.9[142492]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtlogd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:31 np0005554845 python3.9[142617]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtlogd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765432409.5656486-1624-93958477651201/.source.conf follow=False _original_basename=virtlogd.conf checksum=d7a72ae92c2c205983b029473e05a6aa4c58ec24 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:31 np0005554845 python3.9[142769]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtnodedevd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:32 np0005554845 python3.9[142894]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtnodedevd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765432411.2181332-1624-70797349812308/.source.conf follow=False _original_basename=virtnodedevd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:32 np0005554845 python3.9[143046]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtproxyd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:33 np0005554845 python3.9[143171]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtproxyd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765432412.3971097-1624-49859807083458/.source.conf follow=False _original_basename=virtproxyd.conf checksum=28bc484b7c9988e03de49d4fcc0a088ea975f716 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:34 np0005554845 python3.9[143323]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtqemud.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:34 np0005554845 python3.9[143448]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtqemud.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765432413.5540578-1624-224622857681011/.source.conf follow=False _original_basename=virtqemud.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:35 np0005554845 python3.9[143600]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/qemu.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:36 np0005554845 python3.9[143725]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/qemu.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765432414.95035-1624-162560569962588/.source.conf follow=False _original_basename=qemu.conf.j2 checksum=c44de21af13c90603565570f09ff60c6a41ed8df backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:36 np0005554845 python3.9[143877]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/virtsecretd.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:37 np0005554845 python3.9[144002]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/virtsecretd.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765432416.3264992-1624-182415225019837/.source.conf follow=False _original_basename=virtsecretd.conf checksum=7a604468adb2868f1ab6ebd0fd4622286e6373e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:37 np0005554845 python3.9[144154]: ansible-ansible.legacy.stat Invoked with path=/etc/libvirt/auth.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:38 np0005554845 python3.9[144277]: ansible-ansible.legacy.copy Invoked with dest=/etc/libvirt/auth.conf group=libvirt mode=0600 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765432417.5102172-1624-41013994036001/.source.conf follow=False _original_basename=auth.conf checksum=a94cd818c374cec2c8425b70d2e0e2f41b743ae4 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:39 np0005554845 python3.9[144429]: ansible-ansible.legacy.stat Invoked with path=/etc/sasl2/libvirt.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:39 np0005554845 python3.9[144554]: ansible-ansible.legacy.copy Invoked with dest=/etc/sasl2/libvirt.conf group=libvirt mode=0640 owner=libvirt src=/home/zuul/.ansible/tmp/ansible-tmp-1765432418.7313316-1624-157476411623934/.source.conf follow=False _original_basename=sasl_libvirt.conf checksum=652e4d404bf79253d06956b8e9847c9364979d4a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:41 np0005554845 python3.9[144706]: ansible-ansible.legacy.command Invoked with cmd=saslpasswd2 -f /etc/libvirt/passwd.db -p -a libvirt -u openstack migration stdin=12345678 _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None
Dec 11 00:53:42 np0005554845 python3.9[144859]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:43 np0005554845 python3.9[145011]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtlogd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:43 np0005554845 python3.9[145163]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:44 np0005554845 python3.9[145315]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:45 np0005554845 python3.9[145467]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtnodedevd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:45 np0005554845 python3.9[145619]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:46 np0005554845 python3.9[145771]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:47 np0005554845 python3.9[145923]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtproxyd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:47 np0005554845 python3.9[146075]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:48 np0005554845 python3.9[146227]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:49 np0005554845 python3.9[146379]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtqemud-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:49 np0005554845 python3.9[146531]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:50 np0005554845 python3.9[146683]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-ro.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:51 np0005554845 python3.9[146835]: ansible-ansible.builtin.file Invoked with group=root mode=0755 owner=root path=/etc/systemd/system/virtsecretd-admin.socket.d state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:52 np0005554845 python3.9[146987]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:52 np0005554845 python3.9[147110]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432431.8756804-2287-206832613416463/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:53 np0005554845 python3.9[147262]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtlogd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:54 np0005554845 python3.9[147385]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtlogd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432433.2403567-2287-141088106403704/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:55 np0005554845 python3.9[147537]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:55 np0005554845 python3.9[147660]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432434.6266997-2287-254338412131851/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:56 np0005554845 python3.9[147812]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:56 np0005554845 python3.9[147935]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432435.7271357-2287-182161221840824/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:57 np0005554845 python3.9[148087]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:57 np0005554845 podman[148211]: 2025-12-11 05:53:57.737078987 +0000 UTC m=+0.054505894 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 00:53:57 np0005554845 podman[148230]: 2025-12-11 05:53:57.850329562 +0000 UTC m=+0.095772405 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_controller)
Dec 11 00:53:57 np0005554845 python3.9[148210]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtnodedevd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432436.873209-2287-59691176095469/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:58 np0005554845 python3.9[148407]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:53:59 np0005554845 python3.9[148530]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432438.038742-2287-76366091152834/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:53:59 np0005554845 python3.9[148682]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:00 np0005554845 python3.9[148805]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432439.2554193-2287-181986869940067/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:00 np0005554845 python3.9[148957]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:01 np0005554845 python3.9[149080]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtproxyd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432440.500486-2287-159550940399743/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:02 np0005554845 python3.9[149232]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:02 np0005554845 python3.9[149355]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432441.8311656-2287-97404187614220/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:03 np0005554845 python3.9[149507]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:04 np0005554845 python3.9[149630]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432443.094382-2287-112983079368438/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:04 np0005554845 python3.9[149782]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtqemud-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:05 np0005554845 python3.9[149905]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtqemud-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432444.3240523-2287-246956257510048/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:06 np0005554845 python3.9[150057]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:06 np0005554845 python3.9[150180]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432445.6137977-2287-280535589062975/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:07 np0005554845 python3.9[150332]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:08 np0005554845 python3.9[150455]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-ro.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432446.9080834-2287-281198735323079/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:08 np0005554845 python3.9[150607]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:09 np0005554845 python3.9[150730]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virtsecretd-admin.socket.d/override.conf group=root mode=0644 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432448.1769793-2287-229140458307350/.source.conf follow=False _original_basename=libvirt-socket.unit.j2 checksum=0bad41f409b4ee7e780a2a59dc18f5c84ed99826 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:11 np0005554845 python3.9[150880]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail#012ls -lRZ /run/libvirt | grep -E ':container_\S+_t'#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:54:12 np0005554845 python3.9[151035]: ansible-ansible.posix.seboolean Invoked with name=os_enable_vtpm persistent=True state=True ignore_selinux_state=False
Dec 11 00:54:14 np0005554845 dbus-broker-launch[771]: avc:  op=load_policy lsm=selinux seqno=15 res=1
Dec 11 00:54:14 np0005554845 python3.9[151191]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/servercert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:15 np0005554845 python3.9[151343]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/serverkey.pem group=root mode=0600 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:16 np0005554845 python3.9[151495]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/clientcert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:16 np0005554845 python3.9[151647]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/libvirt/private/clientkey.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:17 np0005554845 python3.9[151799]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/CA/cacert.pem group=root mode=0644 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:18 np0005554845 python3.9[151951]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:19 np0005554845 python3.9[152103]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/server-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:19 np0005554845 python3.9[152255]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:20 np0005554845 python3.9[152407]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/client-key.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/tls.key backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:21 np0005554845 python3.9[152559]: ansible-ansible.legacy.copy Invoked with dest=/etc/pki/qemu/ca-cert.pem group=qemu mode=0640 owner=root remote_src=True src=/var/lib/openstack/certs/libvirt/default/ca.crt backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:22 np0005554845 python3.9[152711]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtlogd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:54:22 np0005554845 systemd[1]: Reloading.
Dec 11 00:54:22 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:54:22 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:54:22 np0005554845 systemd[1]: Starting libvirt logging daemon socket...
Dec 11 00:54:22 np0005554845 systemd[1]: Listening on libvirt logging daemon socket.
Dec 11 00:54:22 np0005554845 systemd[1]: Starting libvirt logging daemon admin socket...
Dec 11 00:54:22 np0005554845 systemd[1]: Listening on libvirt logging daemon admin socket.
Dec 11 00:54:22 np0005554845 systemd[1]: Starting libvirt logging daemon...
Dec 11 00:54:23 np0005554845 systemd[1]: Started libvirt logging daemon.
Dec 11 00:54:23 np0005554845 python3.9[152904]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtnodedevd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:54:23 np0005554845 systemd[1]: Reloading.
Dec 11 00:54:23 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:54:23 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:54:24 np0005554845 systemd[1]: Starting SETroubleshoot daemon for processing new SELinux denial logs...
Dec 11 00:54:24 np0005554845 systemd[1]: Starting libvirt nodedev daemon socket...
Dec 11 00:54:24 np0005554845 systemd[1]: Listening on libvirt nodedev daemon socket.
Dec 11 00:54:24 np0005554845 systemd[1]: Starting libvirt nodedev daemon admin socket...
Dec 11 00:54:24 np0005554845 systemd[1]: Starting libvirt nodedev daemon read-only socket...
Dec 11 00:54:24 np0005554845 systemd[1]: Listening on libvirt nodedev daemon admin socket.
Dec 11 00:54:24 np0005554845 systemd[1]: Listening on libvirt nodedev daemon read-only socket.
Dec 11 00:54:24 np0005554845 systemd[1]: Starting libvirt nodedev daemon...
Dec 11 00:54:24 np0005554845 systemd[1]: Started libvirt nodedev daemon.
Dec 11 00:54:24 np0005554845 systemd[1]: Started SETroubleshoot daemon for processing new SELinux denial logs.
Dec 11 00:54:24 np0005554845 systemd[1]: Created slice Slice /system/dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged.
Dec 11 00:54:24 np0005554845 systemd[1]: Started dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service.
Dec 11 00:54:25 np0005554845 python3.9[153128]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtproxyd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:54:25 np0005554845 systemd[1]: Reloading.
Dec 11 00:54:25 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:54:25 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:54:25 np0005554845 systemd[1]: Starting libvirt proxy daemon admin socket...
Dec 11 00:54:25 np0005554845 systemd[1]: Starting libvirt proxy daemon read-only socket...
Dec 11 00:54:25 np0005554845 systemd[1]: Listening on libvirt proxy daemon admin socket.
Dec 11 00:54:25 np0005554845 systemd[1]: Listening on libvirt proxy daemon read-only socket.
Dec 11 00:54:25 np0005554845 systemd[1]: Starting libvirt proxy daemon...
Dec 11 00:54:25 np0005554845 systemd[1]: Started libvirt proxy daemon.
Dec 11 00:54:25 np0005554845 setroubleshoot[152941]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l b10c69ad-df71-437c-8a86-600baa9a4705
Dec 11 00:54:25 np0005554845 setroubleshoot[152941]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec 11 00:54:25 np0005554845 setroubleshoot[152941]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability. For complete SELinux messages run: sealert -l b10c69ad-df71-437c-8a86-600baa9a4705
Dec 11 00:54:25 np0005554845 setroubleshoot[152941]: SELinux is preventing /usr/sbin/virtlogd from using the dac_read_search capability.#012#012*****  Plugin dac_override (91.4 confidence) suggests   **********************#012#012If you want to help identify if domain needs this access or you have a file with the wrong permissions on your system#012Then turn on full auditing to get path information about the offending file and generate the error again.#012Do#012#012Turn on full auditing#012# auditctl -w /etc/shadow -p w#012Try to recreate AVC. Then execute#012# ausearch -m avc -ts recent#012If you see PATH record check ownership/permissions on file, and fix it,#012otherwise report as a bugzilla.#012#012*****  Plugin catchall (9.59 confidence) suggests   **************************#012#012If you believe that virtlogd should have the dac_read_search capability by default.#012Then you should report this as a bug.#012You can generate a local policy module to allow this access.#012Do#012allow this access for now by executing:#012# ausearch -c 'virtlogd' --raw | audit2allow -M my-virtlogd#012# semodule -X 300 -i my-virtlogd.pp#012
Dec 11 00:54:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:54:26.205 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:54:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:54:26.206 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:54:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:54:26.206 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:54:26 np0005554845 python3.9[153342]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtqemud.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:54:26 np0005554845 systemd[1]: Reloading.
Dec 11 00:54:26 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:54:26 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:54:26 np0005554845 systemd[1]: Listening on libvirt locking daemon socket.
Dec 11 00:54:26 np0005554845 systemd[1]: Starting libvirt QEMU daemon socket...
Dec 11 00:54:26 np0005554845 systemd[1]: Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 11 00:54:26 np0005554845 systemd[1]: Starting Virtual Machine and Container Registration Service...
Dec 11 00:54:26 np0005554845 systemd[1]: Listening on libvirt QEMU daemon socket.
Dec 11 00:54:26 np0005554845 systemd[1]: Starting libvirt QEMU daemon admin socket...
Dec 11 00:54:26 np0005554845 systemd[1]: Starting libvirt QEMU daemon read-only socket...
Dec 11 00:54:26 np0005554845 systemd[1]: Listening on libvirt QEMU daemon admin socket.
Dec 11 00:54:26 np0005554845 systemd[1]: Listening on libvirt QEMU daemon read-only socket.
Dec 11 00:54:26 np0005554845 systemd[1]: Started Virtual Machine and Container Registration Service.
Dec 11 00:54:26 np0005554845 systemd[1]: Starting libvirt QEMU daemon...
Dec 11 00:54:26 np0005554845 systemd[1]: Started libvirt QEMU daemon.
Dec 11 00:54:27 np0005554845 python3.9[153556]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True name=virtsecretd.service state=restarted daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:54:27 np0005554845 systemd[1]: Reloading.
Dec 11 00:54:27 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:54:27 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:54:27 np0005554845 systemd[1]: Starting libvirt secret daemon socket...
Dec 11 00:54:27 np0005554845 systemd[1]: Listening on libvirt secret daemon socket.
Dec 11 00:54:27 np0005554845 systemd[1]: Starting libvirt secret daemon admin socket...
Dec 11 00:54:27 np0005554845 systemd[1]: Starting libvirt secret daemon read-only socket...
Dec 11 00:54:27 np0005554845 systemd[1]: Listening on libvirt secret daemon read-only socket.
Dec 11 00:54:27 np0005554845 systemd[1]: Listening on libvirt secret daemon admin socket.
Dec 11 00:54:27 np0005554845 systemd[1]: Starting libvirt secret daemon...
Dec 11 00:54:27 np0005554845 systemd[1]: Started libvirt secret daemon.
Dec 11 00:54:27 np0005554845 podman[153593]: 2025-12-11 05:54:27.950174361 +0000 UTC m=+0.120331027 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_managed=true, managed_by=edpm_ansible)
Dec 11 00:54:27 np0005554845 podman[153610]: 2025-12-11 05:54:27.961152269 +0000 UTC m=+0.090366393 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 11 00:54:29 np0005554845 python3.9[153813]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/openstack/config/ceph state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:30 np0005554845 python3.9[153965]: ansible-ansible.builtin.find Invoked with paths=['/var/lib/openstack/config/ceph'] patterns=['*.conf'] read_whole_file=False file_type=file age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 00:54:31 np0005554845 python3.9[154117]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/libvirt.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:32 np0005554845 python3.9[154240]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/libvirt.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432471.3510864-3322-56725134749659/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=5ca83b1310a74c5e48c4c3d4640e1cb8fdac1061 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:33 np0005554845 python3.9[154392]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:34 np0005554845 python3.9[154544]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:34 np0005554845 python3.9[154622]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:35 np0005554845 python3.9[154774]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:35 np0005554845 systemd[1]: dbus-:1.1-org.fedoraproject.SetroubleshootPrivileged@0.service: Deactivated successfully.
Dec 11 00:54:35 np0005554845 systemd[1]: setroubleshootd.service: Deactivated successfully.
Dec 11 00:54:36 np0005554845 python3.9[154852]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.7k1bg_ru recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:37 np0005554845 python3.9[155004]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:37 np0005554845 python3.9[155082]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:38 np0005554845 python3.9[155234]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:54:39 np0005554845 python3[155387]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 11 00:54:40 np0005554845 python3.9[155539]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:40 np0005554845 python3.9[155617]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:41 np0005554845 python3.9[155769]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:42 np0005554845 python3.9[155847]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:43 np0005554845 python3.9[155999]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:44 np0005554845 python3.9[156077]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:45 np0005554845 python3.9[156229]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:45 np0005554845 python3.9[156307]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:46 np0005554845 python3.9[156459]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:47 np0005554845 python3.9[156584]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432485.8949218-3698-22314601967869/.source.nft follow=False _original_basename=ruleset.j2 checksum=8a12d4eb5149b6e500230381c1359a710881e9b0 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:47 np0005554845 python3.9[156736]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:48 np0005554845 python3.9[156888]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:54:49 np0005554845 python3.9[157043]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:50 np0005554845 python3.9[157195]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:54:51 np0005554845 python3.9[157348]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:54:52 np0005554845 python3.9[157502]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:54:53 np0005554845 python3.9[157657]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:54 np0005554845 python3.9[157809]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:54 np0005554845 python3.9[157932]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432493.5485306-3914-81837494306911/.source.target follow=False _original_basename=edpm_libvirt.target checksum=13035a1aa0f414c677b14be9a5a363b6623d393c backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:55 np0005554845 python3.9[158084]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm_libvirt_guests.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:56 np0005554845 python3.9[158207]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/edpm_libvirt_guests.service mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432495.0627525-3958-272860882811167/.source.service follow=False _original_basename=edpm_libvirt_guests.service checksum=db83430a42fc2ccfd6ed8b56ebf04f3dff9cd0cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:56 np0005554845 python3.9[158359]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/virt-guest-shutdown.target follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:54:57 np0005554845 python3.9[158482]: ansible-ansible.legacy.copy Invoked with dest=/etc/systemd/system/virt-guest-shutdown.target mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432496.4387095-4004-180644082597615/.source.target follow=False _original_basename=virt-guest-shutdown.target checksum=49ca149619c596cbba877418629d2cf8f7b0f5cf backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:54:58 np0005554845 podman[158572]: 2025-12-11 05:54:58.167177105 +0000 UTC m=+0.092657016 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 11 00:54:58 np0005554845 podman[158583]: 2025-12-11 05:54:58.17951617 +0000 UTC m=+0.101386863 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 11 00:54:58 np0005554845 python3.9[158677]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt.target state=restarted daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:54:58 np0005554845 systemd[1]: Reloading.
Dec 11 00:54:58 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:54:58 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:54:58 np0005554845 systemd[1]: Reached target edpm_libvirt.target.
Dec 11 00:54:59 np0005554845 python3.9[158868]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm_libvirt_guests daemon_reexec=False scope=system no_block=False state=None force=None masked=None
Dec 11 00:54:59 np0005554845 systemd[1]: Reloading.
Dec 11 00:54:59 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:54:59 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:55:00 np0005554845 systemd[1]: Reloading.
Dec 11 00:55:00 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:55:00 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:55:01 np0005554845 systemd[1]: session-24.scope: Deactivated successfully.
Dec 11 00:55:01 np0005554845 systemd[1]: session-24.scope: Consumed 3min 21.840s CPU time.
Dec 11 00:55:01 np0005554845 systemd-logind[789]: Session 24 logged out. Waiting for processes to exit.
Dec 11 00:55:01 np0005554845 systemd-logind[789]: Removed session 24.
Dec 11 00:55:06 np0005554845 systemd-logind[789]: New session 25 of user zuul.
Dec 11 00:55:06 np0005554845 systemd[1]: Started Session 25 of User zuul.
Dec 11 00:55:07 np0005554845 python3.9[159119]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:55:08 np0005554845 python3.9[159273]: ansible-ansible.builtin.service_facts Invoked
Dec 11 00:55:08 np0005554845 network[159290]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 00:55:08 np0005554845 network[159291]: 'network-scripts' will be removed from distribution in near future.
Dec 11 00:55:08 np0005554845 network[159292]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 00:55:15 np0005554845 python3.9[159563]: ansible-ansible.legacy.setup Invoked with filter=['ansible_pkg_mgr'] gather_subset=['!all'] gather_timeout=10 fact_path=/etc/ansible/facts.d
Dec 11 00:55:16 np0005554845 python3.9[159647]: ansible-ansible.legacy.dnf Invoked with name=['iscsi-initiator-utils'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:55:23 np0005554845 python3.9[159800]: ansible-ansible.builtin.stat Invoked with path=/var/lib/config-data/puppet-generated/iscsid/etc/iscsi follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:55:23 np0005554845 python3.9[159952]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/restorecon -nvr /etc/iscsi /var/lib/iscsi _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:55:24 np0005554845 python3.9[160105]: ansible-ansible.builtin.stat Invoked with path=/etc/iscsi/.initiator_reset follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:55:25 np0005554845 python3.9[160257]: ansible-ansible.legacy.command Invoked with _raw_params=/usr/sbin/iscsi-iname _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:55:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:55:26.206 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:55:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:55:26.207 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:55:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:55:26.207 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:55:26 np0005554845 python3.9[160410]: ansible-ansible.legacy.stat Invoked with path=/etc/iscsi/initiatorname.iscsi follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:55:27 np0005554845 python3.9[160533]: ansible-ansible.legacy.copy Invoked with dest=/etc/iscsi/initiatorname.iscsi mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432526.2221463-248-135283919372505/.source.iscsi _original_basename=.9u2uqxo7 follow=False checksum=8e73b0d08c2f2b5b51f77a9e08d9f7568d095eae backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:28 np0005554845 podman[160685]: 2025-12-11 05:55:28.304455839 +0000 UTC m=+0.051407512 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 11 00:55:28 np0005554845 podman[160686]: 2025-12-11 05:55:28.338361769 +0000 UTC m=+0.086276469 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 11 00:55:28 np0005554845 python3.9[160687]: ansible-ansible.builtin.file Invoked with mode=0600 path=/etc/iscsi/.initiator_reset state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:29 np0005554845 python3.9[160884]: ansible-ansible.builtin.lineinfile Invoked with insertafter=^#node.session.auth.chap.algs line=node.session.auth.chap_algs = SHA3-256,SHA256,SHA1,MD5 path=/etc/iscsi/iscsid.conf regexp=^node.session.auth.chap_algs state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:29 np0005554845 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 00:55:29 np0005554845 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 00:55:30 np0005554845 python3.9[161037]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:55:30 np0005554845 systemd[1]: Listening on Open-iSCSI iscsid Socket.
Dec 11 00:55:31 np0005554845 python3.9[161193]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=iscsid state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:55:32 np0005554845 systemd[1]: Reloading.
Dec 11 00:55:32 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:55:32 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:55:33 np0005554845 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 11 00:55:33 np0005554845 systemd[1]: Starting Open-iSCSI...
Dec 11 00:55:33 np0005554845 kernel: Loading iSCSI transport class v2.0-870.
Dec 11 00:55:33 np0005554845 systemd[1]: Started Open-iSCSI.
Dec 11 00:55:33 np0005554845 systemd[1]: Starting Logout off all iSCSI sessions on shutdown...
Dec 11 00:55:33 np0005554845 systemd[1]: Finished Logout off all iSCSI sessions on shutdown.
Dec 11 00:55:34 np0005554845 python3.9[161396]: ansible-ansible.builtin.service_facts Invoked
Dec 11 00:55:34 np0005554845 network[161413]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 00:55:34 np0005554845 network[161414]: 'network-scripts' will be removed from distribution in near future.
Dec 11 00:55:34 np0005554845 network[161415]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 00:55:39 np0005554845 python3.9[161686]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 11 00:55:40 np0005554845 python3.9[161838]: ansible-community.general.modprobe Invoked with name=dm-multipath state=present params= persistent=disabled
Dec 11 00:55:41 np0005554845 python3.9[161994]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/dm-multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:55:42 np0005554845 python3.9[162117]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/dm-multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432541.080617-479-85670303114965/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=065061c60917e4f67cecc70d12ce55e42f9d0b3f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:43 np0005554845 python3.9[162269]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=dm-multipath  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:44 np0005554845 python3.9[162421]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:55:44 np0005554845 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 11 00:55:44 np0005554845 systemd[1]: Stopped Load Kernel Modules.
Dec 11 00:55:44 np0005554845 systemd[1]: Stopping Load Kernel Modules...
Dec 11 00:55:44 np0005554845 systemd[1]: Starting Load Kernel Modules...
Dec 11 00:55:44 np0005554845 systemd[1]: Finished Load Kernel Modules.
Dec 11 00:55:45 np0005554845 python3.9[162577]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:55:46 np0005554845 python3.9[162729]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:55:46 np0005554845 python3.9[162881]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:55:47 np0005554845 python3.9[163033]: ansible-ansible.legacy.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:55:48 np0005554845 python3.9[163156]: ansible-ansible.legacy.copy Invoked with dest=/etc/multipath.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432547.1290963-653-126765367374660/.source.conf _original_basename=multipath.conf follow=False checksum=bf02ab264d3d648048a81f3bacec8bc58db93162 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:49 np0005554845 python3.9[163309]: ansible-ansible.legacy.command Invoked with _raw_params=grep -q '^blacklist\s*{' /etc/multipath.conf _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:55:50 np0005554845 python3.9[163462]: ansible-ansible.builtin.lineinfile Invoked with line=blacklist { path=/etc/multipath.conf state=present encoding=utf-8 backrefs=False create=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:50 np0005554845 python3.9[163614]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^(blacklist {) replace=\1\n} backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:51 np0005554845 python3.9[163766]: ansible-ansible.builtin.replace Invoked with path=/etc/multipath.conf regexp=^blacklist\s*{\n[\s]+devnode \"\.\*\" replace=blacklist { backup=False encoding=utf-8 unsafe_writes=False after=None before=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:52 np0005554845 python3.9[163918]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        find_multipaths yes path=/etc/multipath.conf regexp=^\s+find_multipaths state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:53 np0005554845 python3.9[164070]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        recheck_wwid yes path=/etc/multipath.conf regexp=^\s+recheck_wwid state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:53 np0005554845 python3.9[164222]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        skip_kpartx yes path=/etc/multipath.conf regexp=^\s+skip_kpartx state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:54 np0005554845 python3.9[164374]: ansible-ansible.builtin.lineinfile Invoked with firstmatch=True insertafter=^defaults line=        user_friendly_names no path=/etc/multipath.conf regexp=^\s+user_friendly_names state=present encoding=utf-8 backrefs=False create=False backup=False unsafe_writes=False search_string=None insertbefore=None validate=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:55 np0005554845 python3.9[164526]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath.conf follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:55:56 np0005554845 python3.9[164680]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/multipath/.multipath_restart_required state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:55:57 np0005554845 python3.9[164832]: ansible-ansible.builtin.file Invoked with path=/var/local/libexec recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:55:58 np0005554845 python3.9[164984]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-container-shutdown follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:55:58 np0005554845 podman[165062]: 2025-12-11 05:55:58.441341246 +0000 UTC m=+0.067414570 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 11 00:55:58 np0005554845 podman[165063]: 2025-12-11 05:55:58.479109827 +0000 UTC m=+0.105394217 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 11 00:55:58 np0005554845 python3.9[165064]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-container-shutdown _original_basename=edpm-container-shutdown recurse=False state=file path=/var/local/libexec/edpm-container-shutdown force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:55:59 np0005554845 python3.9[165262]: ansible-ansible.legacy.stat Invoked with path=/var/local/libexec/edpm-start-podman-container follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:55:59 np0005554845 python3.9[165340]: ansible-ansible.legacy.file Invoked with group=root mode=0700 owner=root setype=container_file_t dest=/var/local/libexec/edpm-start-podman-container _original_basename=edpm-start-podman-container recurse=False state=file path=/var/local/libexec/edpm-start-podman-container force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:56:00 np0005554845 python3.9[165492]: ansible-ansible.builtin.file Invoked with mode=420 path=/etc/systemd/system-preset state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:01 np0005554845 python3.9[165644]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/edpm-container-shutdown.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:56:01 np0005554845 python3.9[165722]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/edpm-container-shutdown.service _original_basename=edpm-container-shutdown-service recurse=False state=file path=/etc/systemd/system/edpm-container-shutdown.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:03 np0005554845 python3.9[165874]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:56:03 np0005554845 python3.9[165952]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-edpm-container-shutdown.preset _original_basename=91-edpm-container-shutdown-preset recurse=False state=file path=/etc/systemd/system-preset/91-edpm-container-shutdown.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:04 np0005554845 python3.9[166104]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=edpm-container-shutdown state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:56:04 np0005554845 systemd[1]: Reloading.
Dec 11 00:56:04 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:56:04 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:56:05 np0005554845 python3.9[166293]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system/netns-placeholder.service follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:56:05 np0005554845 python3.9[166371]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system/netns-placeholder.service _original_basename=netns-placeholder-service recurse=False state=file path=/etc/systemd/system/netns-placeholder.service force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:06 np0005554845 python3.9[166523]: ansible-ansible.legacy.stat Invoked with path=/etc/systemd/system-preset/91-netns-placeholder.preset follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:56:07 np0005554845 python3.9[166601]: ansible-ansible.legacy.file Invoked with group=root mode=0644 owner=root dest=/etc/systemd/system-preset/91-netns-placeholder.preset _original_basename=91-netns-placeholder-preset recurse=False state=file path=/etc/systemd/system-preset/91-netns-placeholder.preset force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:08 np0005554845 python3.9[166753]: ansible-ansible.builtin.systemd Invoked with daemon_reload=True enabled=True name=netns-placeholder state=started daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:56:08 np0005554845 systemd[1]: Reloading.
Dec 11 00:56:08 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:56:08 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:56:08 np0005554845 systemd[1]: Starting Create netns directory...
Dec 11 00:56:08 np0005554845 systemd[1]: run-netns-placeholder.mount: Deactivated successfully.
Dec 11 00:56:08 np0005554845 systemd[1]: netns-placeholder.service: Deactivated successfully.
Dec 11 00:56:08 np0005554845 systemd[1]: Finished Create netns directory.
Dec 11 00:56:09 np0005554845 python3.9[166948]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:56:10 np0005554845 python3.9[167100]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/multipathd/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:56:10 np0005554845 python3.9[167223]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/multipathd/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432569.8095396-1273-130284055517685/.source _original_basename=healthcheck follow=False checksum=af9d0c1c8f3cb0e30ce9609be9d5b01924d0d23f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:56:12 np0005554845 python3.9[167375]: ansible-ansible.builtin.file Invoked with path=/var/lib/kolla/config_files recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:56:12 np0005554845 python3.9[167527]: ansible-ansible.legacy.stat Invoked with path=/var/lib/kolla/config_files/multipathd.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:56:13 np0005554845 python3.9[167650]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/kolla/config_files/multipathd.json mode=0600 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432572.3181376-1348-228851631248185/.source.json _original_basename=.jxd44l_a follow=False checksum=3f7959ee8ac9757398adcc451c3b416c957d7c14 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:14 np0005554845 python3.9[167802]: ansible-ansible.builtin.file Invoked with mode=0755 path=/var/lib/edpm-config/container-startup-config/multipathd state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:16 np0005554845 python3.9[168229]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/edpm-config/container-startup-config/multipathd config_pattern=*.json debug=False
Dec 11 00:56:17 np0005554845 python3.9[168381]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 00:56:18 np0005554845 python3.9[168533]: ansible-containers.podman.podman_container_info Invoked with executable=podman name=None
Dec 11 00:56:20 np0005554845 python3[168712]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/edpm-config/container-startup-config/multipathd config_id=multipathd config_overrides={} config_patterns=*.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 00:56:21 np0005554845 podman[168748]: 2025-12-11 05:56:20.956170454 +0000 UTC m=+0.022182946 image pull bcd3898ac099c7fff3d2ff3fc32de931119ed36068f8a2617bd8fa95e51d1b81 quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 11 00:56:21 np0005554845 podman[168748]: 2025-12-11 05:56:21.191005793 +0000 UTC m=+0.257018305 container create eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 00:56:21 np0005554845 python3[168712]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name multipathd --conmon-pidfile /run/multipathd.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --healthcheck-command /openstack/healthcheck --label config_id=multipathd --label container_name=multipathd --label managed_by=edpm_ansible --label config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --volume /etc/hosts:/etc/hosts:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro --volume /dev/log:/dev/log --volume /var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro --volume /dev:/dev --volume /run/udev:/run/udev --volume /sys:/sys --volume /lib/modules:/lib/modules:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /var/lib/openstack/healthchecks/multipathd:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-multipathd:current-podified
Dec 11 00:56:22 np0005554845 python3.9[168937]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:56:23 np0005554845 python3.9[169091]: ansible-file Invoked with path=/etc/systemd/system/edpm_multipathd.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:23 np0005554845 python3.9[169167]: ansible-stat Invoked with path=/etc/systemd/system/edpm_multipathd_healthcheck.timer follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:56:24 np0005554845 systemd[1]: virtnodedevd.service: Deactivated successfully.
Dec 11 00:56:24 np0005554845 python3.9[169318]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765432583.6370432-1612-175695991351140/source dest=/etc/systemd/system/edpm_multipathd.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:24 np0005554845 python3.9[169395]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:56:24 np0005554845 systemd[1]: Reloading.
Dec 11 00:56:24 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:56:24 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:56:25 np0005554845 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 11 00:56:25 np0005554845 python3.9[169505]: ansible-systemd Invoked with state=restarted name=edpm_multipathd.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:56:25 np0005554845 systemd[1]: Reloading.
Dec 11 00:56:25 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:56:25 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:56:26 np0005554845 systemd[1]: Starting multipathd container...
Dec 11 00:56:26 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:56:26 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9d60e6cabeb92b85024876565070a731ab60c0460aa7b6de8d186219e352804/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 11 00:56:26 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9d60e6cabeb92b85024876565070a731ab60c0460aa7b6de8d186219e352804/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 11 00:56:26 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec.
Dec 11 00:56:26 np0005554845 podman[169546]: 2025-12-11 05:56:26.202804883 +0000 UTC m=+0.133574694 container init eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_managed=true)
Dec 11 00:56:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:56:26.207 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:56:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:56:26.208 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:56:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:56:26.208 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:56:26 np0005554845 multipathd[169561]: + sudo -E kolla_set_configs
Dec 11 00:56:26 np0005554845 podman[169546]: 2025-12-11 05:56:26.232106831 +0000 UTC m=+0.162876622 container start eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team)
Dec 11 00:56:26 np0005554845 podman[169546]: multipathd
Dec 11 00:56:26 np0005554845 systemd[1]: Started multipathd container.
Dec 11 00:56:26 np0005554845 multipathd[169561]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 00:56:26 np0005554845 multipathd[169561]: INFO:__main__:Validating config file
Dec 11 00:56:26 np0005554845 multipathd[169561]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 00:56:26 np0005554845 multipathd[169561]: INFO:__main__:Writing out command to execute
Dec 11 00:56:26 np0005554845 multipathd[169561]: ++ cat /run_command
Dec 11 00:56:26 np0005554845 multipathd[169561]: + CMD='/usr/sbin/multipathd -d'
Dec 11 00:56:26 np0005554845 multipathd[169561]: + ARGS=
Dec 11 00:56:26 np0005554845 multipathd[169561]: + sudo kolla_copy_cacerts
Dec 11 00:56:26 np0005554845 multipathd[169561]: + [[ ! -n '' ]]
Dec 11 00:56:26 np0005554845 multipathd[169561]: + . kolla_extend_start
Dec 11 00:56:26 np0005554845 multipathd[169561]: Running command: '/usr/sbin/multipathd -d'
Dec 11 00:56:26 np0005554845 multipathd[169561]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 11 00:56:26 np0005554845 multipathd[169561]: + umask 0022
Dec 11 00:56:26 np0005554845 multipathd[169561]: + exec /usr/sbin/multipathd -d
Dec 11 00:56:26 np0005554845 podman[169568]: 2025-12-11 05:56:26.324207757 +0000 UTC m=+0.081987048 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 00:56:26 np0005554845 systemd[1]: eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec-72fec90e70800ce0.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 00:56:26 np0005554845 systemd[1]: eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec-72fec90e70800ce0.service: Failed with result 'exit-code'.
Dec 11 00:56:26 np0005554845 multipathd[169561]: 2861.032372 | --------start up--------
Dec 11 00:56:26 np0005554845 multipathd[169561]: 2861.032384 | read /etc/multipath.conf
Dec 11 00:56:26 np0005554845 multipathd[169561]: 2861.037053 | path checkers start up
Dec 11 00:56:26 np0005554845 systemd[1]: virtqemud.service: Deactivated successfully.
Dec 11 00:56:27 np0005554845 python3.9[169751]: ansible-ansible.builtin.stat Invoked with path=/etc/multipath/.multipath_restart_required follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:56:27 np0005554845 systemd[1]: virtsecretd.service: Deactivated successfully.
Dec 11 00:56:28 np0005554845 python3.9[169905]: ansible-ansible.legacy.command Invoked with _raw_params=podman ps --filter volume=/etc/multipath.conf --format {{.Names}} _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:56:28 np0005554845 podman[170043]: 2025-12-11 05:56:28.822206396 +0000 UTC m=+0.060119054 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent)
Dec 11 00:56:28 np0005554845 podman[170044]: 2025-12-11 05:56:28.873772129 +0000 UTC m=+0.110689430 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true)
Dec 11 00:56:29 np0005554845 python3.9[170110]: ansible-ansible.builtin.systemd Invoked with name=edpm_multipathd state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:56:30 np0005554845 systemd[1]: Stopping multipathd container...
Dec 11 00:56:30 np0005554845 multipathd[169561]: 2864.949916 | exit (signal)
Dec 11 00:56:30 np0005554845 multipathd[169561]: 2864.950052 | --------shut down-------
Dec 11 00:56:30 np0005554845 systemd[1]: libpod-eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec.scope: Deactivated successfully.
Dec 11 00:56:30 np0005554845 podman[170119]: 2025-12-11 05:56:30.280103951 +0000 UTC m=+0.073555462 container died eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.build-date=20251202)
Dec 11 00:56:30 np0005554845 systemd[1]: eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec-72fec90e70800ce0.timer: Deactivated successfully.
Dec 11 00:56:30 np0005554845 systemd[1]: Stopped /usr/bin/podman healthcheck run eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec.
Dec 11 00:56:30 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec-userdata-shm.mount: Deactivated successfully.
Dec 11 00:56:30 np0005554845 systemd[1]: var-lib-containers-storage-overlay-f9d60e6cabeb92b85024876565070a731ab60c0460aa7b6de8d186219e352804-merged.mount: Deactivated successfully.
Dec 11 00:56:30 np0005554845 podman[170119]: 2025-12-11 05:56:30.343798244 +0000 UTC m=+0.137249695 container cleanup eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 00:56:30 np0005554845 podman[170119]: multipathd
Dec 11 00:56:30 np0005554845 podman[170149]: multipathd
Dec 11 00:56:30 np0005554845 systemd[1]: edpm_multipathd.service: Deactivated successfully.
Dec 11 00:56:30 np0005554845 systemd[1]: Stopped multipathd container.
Dec 11 00:56:30 np0005554845 systemd[1]: Starting multipathd container...
Dec 11 00:56:30 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:56:30 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9d60e6cabeb92b85024876565070a731ab60c0460aa7b6de8d186219e352804/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 11 00:56:30 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f9d60e6cabeb92b85024876565070a731ab60c0460aa7b6de8d186219e352804/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 11 00:56:30 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec.
Dec 11 00:56:30 np0005554845 podman[170163]: 2025-12-11 05:56:30.650760591 +0000 UTC m=+0.178987593 container init eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3)
Dec 11 00:56:30 np0005554845 multipathd[170179]: + sudo -E kolla_set_configs
Dec 11 00:56:30 np0005554845 podman[170163]: 2025-12-11 05:56:30.689834534 +0000 UTC m=+0.218061526 container start eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 00:56:30 np0005554845 podman[170163]: multipathd
Dec 11 00:56:30 np0005554845 systemd[1]: Started multipathd container.
Dec 11 00:56:30 np0005554845 multipathd[170179]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 00:56:30 np0005554845 multipathd[170179]: INFO:__main__:Validating config file
Dec 11 00:56:30 np0005554845 multipathd[170179]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 00:56:30 np0005554845 multipathd[170179]: INFO:__main__:Writing out command to execute
Dec 11 00:56:30 np0005554845 podman[170186]: 2025-12-11 05:56:30.767242673 +0000 UTC m=+0.065607369 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=starting, health_failing_streak=1, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 00:56:30 np0005554845 multipathd[170179]: ++ cat /run_command
Dec 11 00:56:30 np0005554845 multipathd[170179]: + CMD='/usr/sbin/multipathd -d'
Dec 11 00:56:30 np0005554845 multipathd[170179]: + ARGS=
Dec 11 00:56:30 np0005554845 multipathd[170179]: + sudo kolla_copy_cacerts
Dec 11 00:56:30 np0005554845 systemd[1]: eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec-40e18dd877c9fe10.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 00:56:30 np0005554845 systemd[1]: eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec-40e18dd877c9fe10.service: Failed with result 'exit-code'.
Dec 11 00:56:30 np0005554845 multipathd[170179]: + [[ ! -n '' ]]
Dec 11 00:56:30 np0005554845 multipathd[170179]: + . kolla_extend_start
Dec 11 00:56:30 np0005554845 multipathd[170179]: Running command: '/usr/sbin/multipathd -d'
Dec 11 00:56:30 np0005554845 multipathd[170179]: + echo 'Running command: '\''/usr/sbin/multipathd -d'\'''
Dec 11 00:56:30 np0005554845 multipathd[170179]: + umask 0022
Dec 11 00:56:30 np0005554845 multipathd[170179]: + exec /usr/sbin/multipathd -d
Dec 11 00:56:30 np0005554845 multipathd[170179]: 2865.508580 | --------start up--------
Dec 11 00:56:30 np0005554845 multipathd[170179]: 2865.508601 | read /etc/multipath.conf
Dec 11 00:56:30 np0005554845 multipathd[170179]: 2865.513613 | path checkers start up
Dec 11 00:56:31 np0005554845 python3.9[170369]: ansible-ansible.builtin.file Invoked with path=/etc/multipath/.multipath_restart_required state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:32 np0005554845 python3.9[170521]: ansible-ansible.builtin.file Invoked with mode=0755 path=/etc/modules-load.d selevel=s0 setype=etc_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None attributes=None
Dec 11 00:56:33 np0005554845 python3.9[170673]: ansible-community.general.modprobe Invoked with name=nvme-fabrics state=present params= persistent=disabled
Dec 11 00:56:33 np0005554845 kernel: Key type psk registered
Dec 11 00:56:34 np0005554845 python3.9[170836]: ansible-ansible.legacy.stat Invoked with path=/etc/modules-load.d/nvme-fabrics.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:56:34 np0005554845 python3.9[170959]: ansible-ansible.legacy.copy Invoked with dest=/etc/modules-load.d/nvme-fabrics.conf mode=0644 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432593.7290964-1853-276188635729741/.source.conf follow=False _original_basename=module-load.conf.j2 checksum=783c778f0c68cc414f35486f234cbb1cf3f9bbff backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:35 np0005554845 python3.9[171111]: ansible-ansible.builtin.lineinfile Invoked with create=True dest=/etc/modules line=nvme-fabrics  mode=0644 state=present path=/etc/modules encoding=utf-8 backrefs=False backup=False firstmatch=False unsafe_writes=False regexp=None search_string=None insertafter=None insertbefore=None validate=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:36 np0005554845 python3.9[171263]: ansible-ansible.builtin.systemd Invoked with name=systemd-modules-load.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:56:36 np0005554845 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 11 00:56:36 np0005554845 systemd[1]: Stopped Load Kernel Modules.
Dec 11 00:56:36 np0005554845 systemd[1]: Stopping Load Kernel Modules...
Dec 11 00:56:36 np0005554845 systemd[1]: Starting Load Kernel Modules...
Dec 11 00:56:36 np0005554845 systemd[1]: Finished Load Kernel Modules.
Dec 11 00:56:37 np0005554845 python3.9[171419]: ansible-ansible.legacy.dnf Invoked with name=['nvme-cli'] state=present allow_downgrade=False allowerasing=False autoremove=False bugfix=False cacheonly=False disable_gpg_check=False disable_plugin=[] disablerepo=[] download_only=False enable_plugin=[] enablerepo=[] exclude=[] installroot=/ install_weak_deps=True security=False skip_broken=False update_cache=False update_only=False validate_certs=True sslverify=True lock_timeout=30 use_backend=auto best=None conf_file=None disable_excludes=None download_dir=None list=None nobest=None releasever=None
Dec 11 00:56:40 np0005554845 systemd[1]: Reloading.
Dec 11 00:56:40 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:56:40 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:56:40 np0005554845 systemd[1]: Reloading.
Dec 11 00:56:40 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:56:40 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:56:40 np0005554845 systemd-logind[789]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 11 00:56:40 np0005554845 systemd-logind[789]: Watching system buttons on /dev/input/event1 (AT Translated Set 2 keyboard)
Dec 11 00:56:40 np0005554845 systemd[1]: Started /usr/bin/systemctl start man-db-cache-update.
Dec 11 00:56:40 np0005554845 systemd[1]: Starting man-db-cache-update.service...
Dec 11 00:56:40 np0005554845 systemd[1]: Reloading.
Dec 11 00:56:41 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:56:41 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:56:41 np0005554845 systemd[1]: Queuing reload/restart jobs for marked units…
Dec 11 00:56:42 np0005554845 systemd[1]: man-db-cache-update.service: Deactivated successfully.
Dec 11 00:56:42 np0005554845 systemd[1]: Finished man-db-cache-update.service.
Dec 11 00:56:42 np0005554845 systemd[1]: man-db-cache-update.service: Consumed 1.740s CPU time.
Dec 11 00:56:42 np0005554845 systemd[1]: run-rcc36e501d1cb4376a8d804c41dd55043.service: Deactivated successfully.
Dec 11 00:56:43 np0005554845 python3.9[172885]: ansible-ansible.builtin.systemd_service Invoked with name=iscsid state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:56:43 np0005554845 systemd[1]: Stopping Open-iSCSI...
Dec 11 00:56:43 np0005554845 iscsid[161236]: iscsid shutting down.
Dec 11 00:56:43 np0005554845 systemd[1]: iscsid.service: Deactivated successfully.
Dec 11 00:56:43 np0005554845 systemd[1]: Stopped Open-iSCSI.
Dec 11 00:56:43 np0005554845 systemd[1]: One time configuration for iscsi.service was skipped because of an unmet condition check (ConditionPathExists=!/etc/iscsi/initiatorname.iscsi).
Dec 11 00:56:43 np0005554845 systemd[1]: Starting Open-iSCSI...
Dec 11 00:56:43 np0005554845 systemd[1]: Started Open-iSCSI.
Dec 11 00:56:44 np0005554845 python3.9[173039]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:56:45 np0005554845 python3.9[173195]: ansible-ansible.builtin.file Invoked with mode=0644 path=/etc/ssh/ssh_known_hosts state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:56:46 np0005554845 python3.9[173347]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:56:46 np0005554845 systemd[1]: Reloading.
Dec 11 00:56:46 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:56:46 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:56:47 np0005554845 python3.9[173533]: ansible-ansible.builtin.service_facts Invoked
Dec 11 00:56:47 np0005554845 network[173550]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 00:56:47 np0005554845 network[173551]: 'network-scripts' will be removed from distribution in near future.
Dec 11 00:56:47 np0005554845 network[173552]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 00:56:52 np0005554845 python3.9[173826]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:56:53 np0005554845 python3.9[173979]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_migration_target.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:56:54 np0005554845 python3.9[174132]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api_cron.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:56:55 np0005554845 python3.9[174285]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_api.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:56:56 np0005554845 python3.9[174438]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_conductor.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:56:57 np0005554845 python3.9[174591]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_metadata.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:56:57 np0005554845 python3.9[174744]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_scheduler.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:56:58 np0005554845 python3.9[174897]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_nova_vnc_proxy.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:56:59 np0005554845 podman[174923]: 2025-12-11 05:56:59.151654287 +0000 UTC m=+0.077693146 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 00:56:59 np0005554845 podman[174924]: 2025-12-11 05:56:59.174148357 +0000 UTC m=+0.106934146 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_controller)
Dec 11 00:56:59 np0005554845 python3.9[175095]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:00 np0005554845 python3.9[175247]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:01 np0005554845 podman[175350]: 2025-12-11 05:57:01.136831358 +0000 UTC m=+0.065029322 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, org.label-schema.build-date=20251202)
Dec 11 00:57:01 np0005554845 python3.9[175421]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:02 np0005554845 python3.9[175573]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:02 np0005554845 python3.9[175725]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:03 np0005554845 python3.9[175877]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:04 np0005554845 python3.9[176029]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:04 np0005554845 python3.9[176181]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:05 np0005554845 python3.9[176333]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:06 np0005554845 python3.9[176485]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_migration_target.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:07 np0005554845 python3.9[176637]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api_cron.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:08 np0005554845 python3.9[176789]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_api.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:08 np0005554845 python3.9[176941]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_conductor.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:09 np0005554845 python3.9[177093]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_metadata.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:10 np0005554845 python3.9[177245]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_scheduler.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:10 np0005554845 python3.9[177397]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_nova_vnc_proxy.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:11 np0005554845 python3.9[177549]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:57:12 np0005554845 python3.9[177701]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 00:57:13 np0005554845 python3.9[177853]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:57:13 np0005554845 systemd[1]: Reloading.
Dec 11 00:57:14 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:57:14 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:57:15 np0005554845 python3.9[178041]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:57:15 np0005554845 python3.9[178194]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_migration_target.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:57:16 np0005554845 python3.9[178347]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api_cron.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:57:17 np0005554845 python3.9[178500]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_api.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:57:17 np0005554845 python3.9[178653]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_conductor.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:57:18 np0005554845 python3.9[178806]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_metadata.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:57:19 np0005554845 python3.9[178959]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_scheduler.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:57:20 np0005554845 python3.9[179112]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_nova_vnc_proxy.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:57:21 np0005554845 python3.9[179265]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:22 np0005554845 python3.9[179417]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/containers setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:23 np0005554845 python3.9[179569]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/config/nova_nvme_cleaner setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:24 np0005554845 python3.9[179721]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:24 np0005554845 python3.9[179873]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/_nova_secontext setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:25 np0005554845 python3.9[180025]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/nova/instances setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:26 np0005554845 python3.9[180177]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/etc/ceph setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:57:26.208 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:57:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:57:26.209 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:57:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:57:26.209 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:57:26 np0005554845 python3.9[180329]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/multipath setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:27 np0005554845 python3.9[180481]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/etc/nvme setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:28 np0005554845 python3.9[180634]: ansible-ansible.builtin.file Invoked with group=zuul owner=zuul path=/run/openvswitch setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:30 np0005554845 podman[180660]: 2025-12-11 05:57:30.162259482 +0000 UTC m=+0.088603002 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 11 00:57:30 np0005554845 podman[180659]: 2025-12-11 05:57:30.161474351 +0000 UTC m=+0.087295857 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Dec 11 00:57:32 np0005554845 podman[180704]: 2025-12-11 05:57:32.140062168 +0000 UTC m=+0.065016372 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=multipathd, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 00:57:34 np0005554845 python3.9[180851]: ansible-ansible.builtin.getent Invoked with database=passwd key=nova fail_key=True service=None split=None
Dec 11 00:57:35 np0005554845 python3.9[181004]: ansible-ansible.builtin.group Invoked with gid=42436 name=nova state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 00:57:36 np0005554845 python3.9[181162]: ansible-ansible.builtin.user Invoked with comment=nova user group=nova groups=['libvirt'] name=nova shell=/bin/sh state=present uid=42436 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 11 00:57:37 np0005554845 systemd-logind[789]: New session 26 of user zuul.
Dec 11 00:57:37 np0005554845 systemd[1]: Started Session 26 of User zuul.
Dec 11 00:57:38 np0005554845 systemd[1]: session-26.scope: Deactivated successfully.
Dec 11 00:57:38 np0005554845 systemd-logind[789]: Session 26 logged out. Waiting for processes to exit.
Dec 11 00:57:38 np0005554845 systemd-logind[789]: Removed session 26.
Dec 11 00:57:39 np0005554845 python3.9[181348]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/config.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:57:39 np0005554845 python3.9[181469]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/config.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432658.4410636-3416-105534094411563/.source.json follow=False _original_basename=config.json.j2 checksum=b51012bfb0ca26296dcf3793a2f284446fb1395e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:40 np0005554845 python3.9[181619]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova-blank.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:57:40 np0005554845 python3.9[181695]: ansible-ansible.legacy.file Invoked with mode=0644 setype=container_file_t dest=/var/lib/openstack/config/nova/nova-blank.conf _original_basename=nova-blank.conf recurse=False state=file path=/var/lib/openstack/config/nova/nova-blank.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:41 np0005554845 python3.9[181845]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/ssh-config follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:57:42 np0005554845 python3.9[181966]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/ssh-config mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432660.9336355-3416-279087287160762/.source follow=False _original_basename=ssh-config checksum=4297f735c41bdc1ff52d72e6f623a02242f37958 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:43 np0005554845 python3.9[182116]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/02-nova-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:57:43 np0005554845 python3.9[182237]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/02-nova-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432662.567722-3416-204829068673441/.source.conf follow=False _original_basename=02-nova-host-specific.conf.j2 checksum=d01cc1b48d783e4ed08d12bb4d0a107aba230a69 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:44 np0005554845 python3.9[182387]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/nova_statedir_ownership.py follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:57:45 np0005554845 python3.9[182508]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/nova_statedir_ownership.py mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432663.8963358-3416-133774812588055/.source.py follow=False _original_basename=nova_statedir_ownership.py checksum=c6c8a3cfefa5efd60ceb1408c4e977becedb71e2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:45 np0005554845 python3.9[182658]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/nova/run-on-host follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:57:46 np0005554845 python3.9[182779]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/nova/run-on-host mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432665.2479422-3416-183635604834200/.source follow=False _original_basename=run-on-host checksum=93aba8edc83d5878604a66d37fea2f12b60bdea2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:47 np0005554845 python3.9[182931]: ansible-ansible.builtin.file Invoked with group=nova mode=0700 owner=nova path=/home/nova/.ssh state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:48 np0005554845 python3.9[183083]: ansible-ansible.legacy.copy Invoked with dest=/home/nova/.ssh/authorized_keys group=nova mode=0600 owner=nova remote_src=True src=/var/lib/openstack/config/nova/ssh-publickey backup=False force=True follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:57:49 np0005554845 python3.9[183235]: ansible-ansible.builtin.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:57:50 np0005554845 python3.9[183387]: ansible-ansible.legacy.stat Invoked with path=/var/lib/nova/compute_id follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:57:50 np0005554845 python3.9[183510]: ansible-ansible.legacy.copy Invoked with attributes=+i dest=/var/lib/nova/compute_id group=nova mode=0400 owner=nova src=/home/zuul/.ansible/tmp/ansible-tmp-1765432669.520499-3738-232472901154170/.source _original_basename=.8ur8eneq follow=False checksum=2516c09f4f6d3d0651b0828bdc7152dd2033bf78 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None
Dec 11 00:57:51 np0005554845 python3.9[183662]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:57:52 np0005554845 python3.9[183814]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:57:53 np0005554845 python3.9[183935]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute.json mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432672.265014-3815-156537648229119/.source.json follow=False _original_basename=nova_compute.json.j2 checksum=211ffd0bca4b407eb4de45a749ef70116a7806fd backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:54 np0005554845 python3.9[184085]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/containers/nova_compute_init.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:57:54 np0005554845 python3.9[184206]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/containers/nova_compute_init.json mode=0700 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432673.5323448-3860-70632080547660/.source.json follow=False _original_basename=nova_compute_init.json.j2 checksum=60b024e6db49dc6e700fc0d50263944d98d4c034 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:57:55 np0005554845 python3.9[184358]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute_init.json debug=False
Dec 11 00:57:56 np0005554845 python3.9[184510]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 00:57:58 np0005554845 python3[184662]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute_init.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 00:57:58 np0005554845 podman[184699]: 2025-12-11 05:57:58.201041568 +0000 UTC m=+0.048260220 container create 3764a6d7e3369d4d867089ab2fb6d42aa6760e1e242de1f610cf05070e000066 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, container_name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, config_id=edpm, org.label-schema.vendor=CentOS)
Dec 11 00:57:58 np0005554845 podman[184699]: 2025-12-11 05:57:58.173597389 +0000 UTC m=+0.020816041 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 11 00:57:58 np0005554845 python3[184662]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute_init --conmon-pidfile /run/nova_compute_init.pid --env NOVA_STATEDIR_OWNERSHIP_SKIP=/var/lib/nova/compute_id --env __OS_DEBUG=False --label config_id=edpm --label container_name=nova_compute_init --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']} --log-driver journald --log-level info --network none --privileged=False --security-opt label=disable --user root --volume /dev/log:/dev/log --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z --volume /var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init
Dec 11 00:57:59 np0005554845 python3.9[184888]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:58:00 np0005554845 podman[185042]: 2025-12-11 05:58:00.344364118 +0000 UTC m=+0.066607691 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 00:58:00 np0005554845 podman[185043]: 2025-12-11 05:58:00.372559848 +0000 UTC m=+0.100849032 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202)
Dec 11 00:58:00 np0005554845 python3.9[185044]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/containers config_pattern=nova_compute.json debug=False
Dec 11 00:58:01 np0005554845 python3.9[185239]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 00:58:02 np0005554845 podman[185363]: 2025-12-11 05:58:02.373358465 +0000 UTC m=+0.091233529 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 00:58:02 np0005554845 python3[185407]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/containers config_id=edpm config_overrides={} config_patterns=nova_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 00:58:03 np0005554845 podman[185448]: 2025-12-11 05:58:02.92092657 +0000 UTC m=+0.028285283 image pull e3166cc074f328e3b121ff82d56ed43a2542af699baffe6874520fe3837c2b18 quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified
Dec 11 00:58:03 np0005554845 podman[185448]: 2025-12-11 05:58:03.053151601 +0000 UTC m=+0.160510224 container create 8f9d1bf61a04ea3d56c20556de75b47d6923bc71eba63b09b075a8c8fb4c9224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=nova_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Dec 11 00:58:03 np0005554845 python3[185407]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name nova_compute --conmon-pidfile /run/nova_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --label config_id=edpm --label container_name=nova_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']} --log-driver journald --log-level info --network host --pid host --privileged=True --user nova --volume /var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro --volume /var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /etc/localtime:/etc/localtime:ro --volume /lib/modules:/lib/modules:ro --volume /dev:/dev --volume /var/lib/libvirt:/var/lib/libvirt --volume /run/libvirt:/run/libvirt:shared --volume /var/lib/nova:/var/lib/nova:shared --volume /var/lib/iscsi:/var/lib/iscsi --volume /etc/multipath:/etc/multipath:z --volume /etc/multipath.conf:/etc/multipath.conf:ro --volume /etc/iscsi:/etc/iscsi:ro --volume /etc/nvme:/etc/nvme --volume /var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro --volume /etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified kolla_start
Dec 11 00:58:03 np0005554845 python3.9[185639]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:58:04 np0005554845 python3.9[185793]: ansible-file Invoked with path=/etc/systemd/system/edpm_nova_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:58:05 np0005554845 python3.9[185944]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765432684.9510546-4135-60340722202711/source dest=/etc/systemd/system/edpm_nova_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:58:06 np0005554845 python3.9[186020]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:58:06 np0005554845 systemd[1]: Reloading.
Dec 11 00:58:06 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:58:06 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:58:07 np0005554845 python3.9[186130]: ansible-systemd Invoked with state=restarted name=edpm_nova_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:58:07 np0005554845 systemd[1]: Reloading.
Dec 11 00:58:07 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:58:07 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:58:07 np0005554845 systemd[1]: Starting nova_compute container...
Dec 11 00:58:07 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:58:07 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b0756e406d514a584060d166ab3eda0201a538887ce2728bc4437e19359d8d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:07 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b0756e406d514a584060d166ab3eda0201a538887ce2728bc4437e19359d8d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:07 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b0756e406d514a584060d166ab3eda0201a538887ce2728bc4437e19359d8d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:07 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b0756e406d514a584060d166ab3eda0201a538887ce2728bc4437e19359d8d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:07 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b0756e406d514a584060d166ab3eda0201a538887ce2728bc4437e19359d8d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:07 np0005554845 podman[186171]: 2025-12-11 05:58:07.882258463 +0000 UTC m=+0.174075790 container init 8f9d1bf61a04ea3d56c20556de75b47d6923bc71eba63b09b075a8c8fb4c9224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, org.label-schema.build-date=20251202)
Dec 11 00:58:07 np0005554845 podman[186171]: 2025-12-11 05:58:07.888302805 +0000 UTC m=+0.180120102 container start 8f9d1bf61a04ea3d56c20556de75b47d6923bc71eba63b09b075a8c8fb4c9224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute)
Dec 11 00:58:07 np0005554845 nova_compute[186187]: + sudo -E kolla_set_configs
Dec 11 00:58:07 np0005554845 podman[186171]: nova_compute
Dec 11 00:58:07 np0005554845 systemd[1]: Started nova_compute container.
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Validating config file
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Copying service configuration files
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Deleting /etc/ceph
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Creating directory /etc/ceph
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /etc/ceph
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Writing out command to execute
Dec 11 00:58:07 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 11 00:58:08 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 11 00:58:08 np0005554845 nova_compute[186187]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 11 00:58:08 np0005554845 nova_compute[186187]: ++ cat /run_command
Dec 11 00:58:08 np0005554845 nova_compute[186187]: + CMD=nova-compute
Dec 11 00:58:08 np0005554845 nova_compute[186187]: + ARGS=
Dec 11 00:58:08 np0005554845 nova_compute[186187]: + sudo kolla_copy_cacerts
Dec 11 00:58:08 np0005554845 nova_compute[186187]: + [[ ! -n '' ]]
Dec 11 00:58:08 np0005554845 nova_compute[186187]: + . kolla_extend_start
Dec 11 00:58:08 np0005554845 nova_compute[186187]: Running command: 'nova-compute'
Dec 11 00:58:08 np0005554845 nova_compute[186187]: + echo 'Running command: '\''nova-compute'\'''
Dec 11 00:58:08 np0005554845 nova_compute[186187]: + umask 0022
Dec 11 00:58:08 np0005554845 nova_compute[186187]: + exec nova-compute
Dec 11 00:58:09 np0005554845 python3.9[186349]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner_healthcheck.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.087 186191 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.088 186191 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.088 186191 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.089 186191 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.232 186191 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.261 186191 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.029s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.261 186191 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec 11 00:58:10 np0005554845 python3.9[186503]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.787 186191 INFO nova.virt.driver [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.920 186191 INFO nova.compute.provider_config [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.932 186191 DEBUG oslo_concurrency.lockutils [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.933 186191 DEBUG oslo_concurrency.lockutils [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.933 186191 DEBUG oslo_concurrency.lockutils [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.933 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.933 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.933 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.934 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.934 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.934 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.934 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.934 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.934 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.934 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.935 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.935 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.935 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.935 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.935 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.935 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.936 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.936 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.936 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.936 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.936 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.936 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.936 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.937 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.937 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.937 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.937 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.937 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.937 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.937 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.938 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.938 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.938 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.938 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.938 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.938 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.938 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.939 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.939 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.939 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.939 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.939 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.939 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.940 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.940 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.940 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.940 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.940 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.940 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.941 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.941 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.941 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.941 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.941 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.941 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.942 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.942 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.942 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.942 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.942 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.942 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.942 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.942 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.943 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.943 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.943 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.943 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.943 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.943 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.943 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.944 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.944 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.944 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.944 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.944 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.944 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.944 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.945 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.945 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.945 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.945 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.945 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.945 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.945 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.946 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.946 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.946 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.946 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.946 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.946 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.946 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.947 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.947 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.947 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.947 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.947 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.947 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.947 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.947 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.948 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.948 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.948 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.948 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.948 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.948 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.948 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.949 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.949 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.949 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.949 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.949 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.949 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.949 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.950 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.950 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.950 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.950 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.950 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.950 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.950 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.951 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.951 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.951 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.951 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.951 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.951 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.951 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.951 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.952 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.952 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.952 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.952 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.952 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.952 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.952 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.953 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.953 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.953 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.953 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.953 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.953 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.954 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.954 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.954 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.954 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.954 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.954 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.954 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.955 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.955 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.955 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.955 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.955 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.955 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.956 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.956 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.956 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.956 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.956 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.956 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.957 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.957 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.957 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.957 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.957 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.957 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.958 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.958 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.958 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.958 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.958 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.958 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.958 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.958 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.959 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.959 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.959 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.959 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.959 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.959 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.959 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.960 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.960 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.960 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.960 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.960 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.960 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.960 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.961 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.961 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.961 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.961 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.961 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.961 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.961 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.962 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.962 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.962 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.962 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.962 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.962 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.962 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.963 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.963 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.963 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.963 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.963 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.963 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.963 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.964 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.964 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.964 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.964 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.964 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.964 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.964 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.965 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.965 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.965 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.965 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.965 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.965 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.965 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.965 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.966 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.966 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.966 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.966 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.966 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.966 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.967 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.967 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.967 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.967 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.967 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.967 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.967 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.967 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.968 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.968 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.968 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.968 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.968 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.968 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.968 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.969 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.969 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.969 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.969 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.969 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.969 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.969 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.970 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.970 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.970 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.970 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.970 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.970 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.970 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.971 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.971 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.971 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.971 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.971 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.971 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.971 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.972 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.972 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.972 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.972 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.972 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.972 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.972 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.972 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.973 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.973 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.973 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.973 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.973 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.973 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.973 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.974 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.974 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.974 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.974 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.974 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.974 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.975 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.975 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.975 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.975 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.975 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.975 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.975 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.976 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.976 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.976 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.976 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.976 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.976 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.976 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.977 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.977 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.977 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.977 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.977 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.977 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.977 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.978 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.978 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.978 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.978 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.978 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.978 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.978 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.978 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.979 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.979 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.979 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.979 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.979 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.979 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.979 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.980 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.980 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.980 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.980 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.980 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.980 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.980 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.981 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.981 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.981 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.981 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.981 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.981 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.981 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.982 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.982 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.982 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.982 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.982 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.982 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.982 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.983 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.983 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.983 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.983 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.983 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.983 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.983 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.984 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.984 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.984 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.984 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.984 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.984 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.985 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.985 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.985 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.985 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.985 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.985 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.985 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.986 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.986 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.986 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.986 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.986 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.986 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.986 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.987 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.987 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.987 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.987 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.987 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.987 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.987 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.987 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.988 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.988 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.988 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.988 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.988 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.988 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.988 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.989 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.989 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.989 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.989 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.989 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.989 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.989 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.990 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.990 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.990 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.990 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.990 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.990 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.990 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.990 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.991 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.991 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.991 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.991 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.991 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.991 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.991 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.992 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.992 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.992 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.992 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.992 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.992 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.992 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.992 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.993 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.993 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.993 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.993 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.993 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.993 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.993 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.993 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.994 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.994 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.994 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.994 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.994 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.994 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.994 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.995 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.995 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.995 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.995 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.995 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.995 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.995 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.995 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.996 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.996 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.996 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.996 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.996 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.996 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.997 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.997 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.997 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.997 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.997 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.997 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.997 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.997 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.998 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.998 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.998 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.998 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.998 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.998 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.998 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.999 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.999 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.999 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.999 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.999 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:10 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.999 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:10.999 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.000 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.000 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.000 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.000 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.000 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.000 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.000 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.000 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.001 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.001 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.001 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.001 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.001 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.001 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.001 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.002 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.002 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.002 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.002 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.002 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.002 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.002 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.002 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.003 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.003 186191 WARNING oslo_config.cfg [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 11 00:58:11 np0005554845 nova_compute[186187]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 11 00:58:11 np0005554845 nova_compute[186187]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 11 00:58:11 np0005554845 nova_compute[186187]: and ``live_migration_inbound_addr`` respectively.
Dec 11 00:58:11 np0005554845 nova_compute[186187]: ).  Its value may be silently ignored in the future.#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.003 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.003 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.003 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.003 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.004 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.004 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.004 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.004 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.004 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.004 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.004 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.005 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.005 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.005 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.005 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.005 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.005 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.005 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.006 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.006 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.006 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.006 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.006 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.006 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.006 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.006 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.007 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.007 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.007 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.007 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.007 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.007 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.007 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.008 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.008 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.008 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.008 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.008 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.008 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.008 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.009 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.009 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.009 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.009 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.009 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.009 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.009 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.010 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.010 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.010 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.010 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.010 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.010 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.010 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.010 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.011 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.011 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.011 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.011 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.011 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.011 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.011 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.012 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.012 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.012 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.012 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.012 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.012 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.012 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.012 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.013 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.013 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.013 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.013 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.013 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.013 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.013 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.013 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.014 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.014 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.014 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.014 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.014 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.014 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.014 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.015 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.015 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.015 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.015 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.015 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.015 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.015 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.016 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.016 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.016 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.016 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.016 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.016 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.016 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.016 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.017 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.017 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.017 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.017 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.017 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.017 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.017 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.018 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.018 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.018 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.018 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.018 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.018 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.018 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.018 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.019 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.019 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.019 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.019 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.019 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.019 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.019 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.019 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.020 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.020 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.020 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.020 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.020 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.020 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.021 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.021 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.021 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.021 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.021 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.021 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.021 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.021 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.022 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.022 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.022 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.022 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.022 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.022 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.023 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.023 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.023 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.023 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.023 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.023 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.023 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.024 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.024 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.024 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.024 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.024 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.024 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.024 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.025 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.025 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.025 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.025 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.025 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.025 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.025 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.026 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.026 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.026 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.026 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.026 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.026 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.026 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.027 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.027 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.027 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.027 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.027 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.027 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.027 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.028 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.028 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.028 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.028 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.028 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.028 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.029 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.029 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.029 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.029 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.029 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.029 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.029 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.029 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.030 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.030 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.030 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.030 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.030 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.030 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.031 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.031 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.031 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.031 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.031 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.031 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.031 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.031 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.032 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.032 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.032 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.032 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.032 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.032 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.032 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.033 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.033 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.033 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.033 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.033 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.033 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.033 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.034 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.034 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.034 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.034 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.034 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.034 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.034 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.034 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.035 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.035 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.035 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.035 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.035 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.035 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.035 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.036 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.036 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.036 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.036 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.036 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.036 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.036 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.037 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.037 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.037 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.037 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.037 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.037 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.037 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.038 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.038 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.038 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.038 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.038 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.038 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.038 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.039 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.039 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.039 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.039 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.039 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.039 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.039 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.040 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.040 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.040 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.040 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.040 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.040 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.040 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.041 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.041 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.041 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.041 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.041 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.041 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.041 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.042 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.042 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.042 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.042 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.042 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.042 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.042 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.042 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.043 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.043 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.043 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.043 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.043 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.043 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.043 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.044 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.044 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.044 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.044 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.044 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.044 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.044 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.045 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.045 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.045 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.045 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.045 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.045 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.045 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.045 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.046 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.046 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.046 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.046 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.046 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.046 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.046 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.047 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.047 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.047 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.047 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.047 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.047 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.047 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.047 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.048 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.048 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.048 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.048 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.048 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.048 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.048 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.049 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.049 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.049 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.049 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.049 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.049 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.049 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.049 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.050 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.050 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.050 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.050 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.050 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.050 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.050 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.051 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.051 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.051 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.051 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.051 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.051 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.051 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.051 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.052 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.052 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.052 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.052 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.052 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.052 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.052 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.053 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.053 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.053 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.053 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.053 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.053 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.053 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.053 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.054 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.054 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.054 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.054 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.054 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.054 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.054 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.054 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.055 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.055 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.055 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.055 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.055 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.055 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.055 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.056 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.056 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.056 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.056 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.056 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.056 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.056 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.056 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.057 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.057 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.057 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.057 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.057 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.057 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.057 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.058 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.058 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.058 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.058 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.058 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.058 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.058 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.058 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.059 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.059 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.059 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.059 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.059 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.059 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.059 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.060 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.060 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.060 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.060 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.060 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.060 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.060 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.060 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.061 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.061 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.061 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.061 186191 DEBUG oslo_service.service [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.062 186191 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.076 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.077 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.077 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.077 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec 11 00:58:11 np0005554845 systemd[1]: Starting libvirt QEMU daemon...
Dec 11 00:58:11 np0005554845 systemd[1]: Started libvirt QEMU daemon.
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.163 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7faebefe05b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.166 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7faebefe05b0> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.168 186191 INFO nova.virt.libvirt.driver [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Connection event '1' reason 'None'#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.183 186191 WARNING nova.virt.libvirt.driver [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Cannot update service status on host "compute-2.ctlplane.example.com" since it is not registered.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Dec 11 00:58:11 np0005554845 nova_compute[186187]: 2025-12-11 05:58:11.184 186191 DEBUG nova.virt.libvirt.volume.mount [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130#033[00m
Dec 11 00:58:11 np0005554845 python3.9[186697]: ansible-ansible.builtin.stat Invoked with path=/etc/systemd/system/edpm_nova_nvme_cleaner.service.requires follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.127 186191 INFO nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Libvirt host capabilities <capabilities>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <host>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <uuid>a858e7f7-0b50-46e3-b377-83982336f3bb</uuid>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <cpu>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <arch>x86_64</arch>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model>EPYC-Rome-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <vendor>AMD</vendor>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <microcode version='16777317'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <signature family='23' model='49' stepping='0'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <maxphysaddr mode='emulate' bits='40'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='x2apic'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='tsc-deadline'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='osxsave'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='hypervisor'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='tsc_adjust'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='spec-ctrl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='stibp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='arch-capabilities'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='cmp_legacy'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='topoext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='virt-ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='lbrv'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='tsc-scale'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='vmcb-clean'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='pause-filter'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='pfthreshold'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='svme-addr-chk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='rdctl-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='skip-l1dfl-vmentry'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='mds-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature name='pschange-mc-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <pages unit='KiB' size='4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <pages unit='KiB' size='2048'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <pages unit='KiB' size='1048576'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </cpu>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <power_management>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <suspend_mem/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <suspend_disk/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <suspend_hybrid/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </power_management>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <iommu support='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <migration_features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <live/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <uri_transports>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <uri_transport>tcp</uri_transport>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <uri_transport>rdma</uri_transport>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </uri_transports>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </migration_features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <topology>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <cells num='1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <cell id='0'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:          <memory unit='KiB'>7864308</memory>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:          <pages unit='KiB' size='4'>1966077</pages>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:          <pages unit='KiB' size='2048'>0</pages>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:          <pages unit='KiB' size='1048576'>0</pages>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:          <distances>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:            <sibling id='0' value='10'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:          </distances>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:          <cpus num='8'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:          </cpus>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        </cell>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </cells>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </topology>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <cache>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </cache>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <secmodel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model>selinux</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <doi>0</doi>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </secmodel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <secmodel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model>dac</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <doi>0</doi>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </secmodel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </host>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <guest>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <os_type>hvm</os_type>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <arch name='i686'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <wordsize>32</wordsize>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <domain type='qemu'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <domain type='kvm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </arch>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <pae/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <nonpae/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <acpi default='on' toggle='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <apic default='on' toggle='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <cpuselection/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <deviceboot/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <disksnapshot default='on' toggle='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <externalSnapshot/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </guest>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <guest>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <os_type>hvm</os_type>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <arch name='x86_64'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <wordsize>64</wordsize>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <domain type='qemu'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <domain type='kvm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </arch>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <acpi default='on' toggle='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <apic default='on' toggle='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <cpuselection/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <deviceboot/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <disksnapshot default='on' toggle='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <externalSnapshot/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </guest>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 
Dec 11 00:58:12 np0005554845 nova_compute[186187]: </capabilities>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.133 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Getting domain capabilities for i686 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.159 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 11 00:58:12 np0005554845 nova_compute[186187]: <domainCapabilities>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <domain>kvm</domain>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <machine>pc-q35-rhel9.8.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <arch>i686</arch>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <vcpu max='4096'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <iothreads supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <os supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <enum name='firmware'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <loader supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>rom</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pflash</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='readonly'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>yes</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>no</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='secure'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>no</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </loader>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </os>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <cpu>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='host-passthrough' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='hostPassthroughMigratable'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>on</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>off</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='maximum' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='maximumMigratable'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>on</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>off</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='host-model' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <vendor>AMD</vendor>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='x2apic'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='hypervisor'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='stibp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='overflow-recov'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='succor'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='lbrv'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc-scale'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='flushbyasid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='pause-filter'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='pfthreshold'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='disable' name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='custom' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Dhyana-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Genoa'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='auto-ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='auto-ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-128'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-256'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-512'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v6'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v7'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='KnightsMill'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512er'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512pf'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='KnightsMill-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512er'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512pf'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G4-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tbm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G5-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tbm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SierraForest'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cmpccxadd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SierraForest-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cmpccxadd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='athlon'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='athlon-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='core2duo'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='core2duo-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='coreduo'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='coreduo-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='n270'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='n270-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='phenom'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='phenom-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </cpu>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <memoryBacking supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <enum name='sourceType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>file</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>anonymous</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>memfd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </memoryBacking>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <devices>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <disk supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='diskDevice'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>disk</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>cdrom</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>floppy</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>lun</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='bus'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>fdc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>scsi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>sata</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-non-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </disk>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <graphics supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vnc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>egl-headless</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dbus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </graphics>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <video supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='modelType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vga</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>cirrus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>none</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>bochs</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>ramfb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </video>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <hostdev supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='mode'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>subsystem</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='startupPolicy'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>default</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>mandatory</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>requisite</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>optional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='subsysType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pci</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>scsi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='capsType'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='pciBackend'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </hostdev>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <rng supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-non-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>random</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>egd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>builtin</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </rng>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <filesystem supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='driverType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>path</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>handle</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtiofs</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </filesystem>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <tpm supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tpm-tis</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tpm-crb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>emulator</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>external</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendVersion'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>2.0</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </tpm>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <redirdev supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='bus'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </redirdev>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <channel supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pty</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>unix</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </channel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <crypto supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>qemu</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>builtin</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </crypto>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <interface supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>default</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>passt</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </interface>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <panic supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>isa</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>hyperv</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </panic>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <console supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>null</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pty</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dev</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>file</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pipe</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>stdio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>udp</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tcp</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>unix</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>qemu-vdagent</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dbus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </console>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </devices>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <gic supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <vmcoreinfo supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <genid supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <backingStoreInput supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <backup supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <async-teardown supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <ps2 supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <sev supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <sgx supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <hyperv supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='features'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>relaxed</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vapic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>spinlocks</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vpindex</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>runtime</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>synic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>stimer</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>reset</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vendor_id</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>frequencies</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>reenlightenment</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tlbflush</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>ipi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>avic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>emsr_bitmap</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>xmm_input</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <defaults>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <spinlocks>4095</spinlocks>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <stimer_direct>on</stimer_direct>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </defaults>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </hyperv>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <launchSecurity supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='sectype'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tdx</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </launchSecurity>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: </domainCapabilities>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.169 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 11 00:58:12 np0005554845 nova_compute[186187]: <domainCapabilities>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <domain>kvm</domain>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <arch>i686</arch>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <vcpu max='240'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <iothreads supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <os supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <enum name='firmware'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <loader supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>rom</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pflash</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='readonly'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>yes</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>no</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='secure'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>no</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </loader>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </os>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <cpu>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='host-passthrough' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='hostPassthroughMigratable'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>on</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>off</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='maximum' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='maximumMigratable'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>on</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>off</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='host-model' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <vendor>AMD</vendor>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='x2apic'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='hypervisor'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='stibp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='overflow-recov'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='succor'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='lbrv'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc-scale'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='flushbyasid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='pause-filter'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='pfthreshold'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='disable' name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='custom' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Dhyana-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Genoa'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='auto-ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='auto-ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-128'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-256'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-512'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v6'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v7'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='KnightsMill'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512er'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512pf'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='KnightsMill-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512er'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512pf'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G4-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tbm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G5-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tbm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SierraForest'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cmpccxadd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SierraForest-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cmpccxadd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='athlon'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='athlon-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='core2duo'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='core2duo-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='coreduo'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='coreduo-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='n270'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='n270-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='phenom'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='phenom-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </cpu>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <memoryBacking supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <enum name='sourceType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>file</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>anonymous</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>memfd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </memoryBacking>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <devices>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <disk supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='diskDevice'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>disk</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>cdrom</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>floppy</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>lun</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='bus'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>ide</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>fdc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>scsi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>sata</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-non-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </disk>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <graphics supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vnc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>egl-headless</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dbus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </graphics>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <video supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='modelType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vga</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>cirrus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>none</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>bochs</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>ramfb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </video>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <hostdev supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='mode'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>subsystem</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='startupPolicy'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>default</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>mandatory</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>requisite</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>optional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='subsysType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pci</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>scsi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='capsType'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='pciBackend'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </hostdev>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <rng supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-non-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>random</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>egd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>builtin</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </rng>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <filesystem supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='driverType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>path</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>handle</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtiofs</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </filesystem>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <tpm supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tpm-tis</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tpm-crb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>emulator</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>external</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendVersion'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>2.0</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </tpm>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <redirdev supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='bus'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </redirdev>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <channel supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pty</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>unix</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </channel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <crypto supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>qemu</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>builtin</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </crypto>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <interface supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>default</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>passt</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </interface>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <panic supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>isa</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>hyperv</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </panic>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <console supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>null</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pty</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dev</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>file</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pipe</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>stdio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>udp</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tcp</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>unix</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>qemu-vdagent</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dbus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </console>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </devices>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <gic supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <vmcoreinfo supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <genid supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <backingStoreInput supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <backup supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <async-teardown supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <ps2 supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <sev supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <sgx supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <hyperv supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='features'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>relaxed</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vapic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>spinlocks</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vpindex</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>runtime</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>synic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>stimer</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>reset</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vendor_id</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>frequencies</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>reenlightenment</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tlbflush</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>ipi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>avic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>emsr_bitmap</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>xmm_input</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <defaults>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <spinlocks>4095</spinlocks>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <stimer_direct>on</stimer_direct>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </defaults>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </hyperv>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <launchSecurity supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='sectype'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tdx</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </launchSecurity>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: </domainCapabilities>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.195 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Getting domain capabilities for x86_64 via machine types: {'q35', 'pc'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.199 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 11 00:58:12 np0005554845 nova_compute[186187]: <domainCapabilities>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <domain>kvm</domain>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <machine>pc-q35-rhel9.8.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <arch>x86_64</arch>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <vcpu max='4096'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <iothreads supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <os supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <enum name='firmware'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>efi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <loader supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>rom</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pflash</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='readonly'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>yes</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>no</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='secure'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>yes</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>no</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </loader>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </os>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <cpu>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='host-passthrough' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='hostPassthroughMigratable'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>on</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>off</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='maximum' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='maximumMigratable'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>on</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>off</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='host-model' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <vendor>AMD</vendor>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='x2apic'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='hypervisor'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='stibp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='overflow-recov'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='succor'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='lbrv'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc-scale'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='flushbyasid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='pause-filter'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='pfthreshold'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='disable' name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='custom' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Dhyana-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Genoa'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='auto-ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='auto-ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-128'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-256'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-512'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v6'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v7'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='KnightsMill'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512er'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512pf'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='KnightsMill-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512er'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512pf'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G4-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tbm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G5-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tbm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SierraForest'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cmpccxadd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SierraForest-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cmpccxadd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='athlon'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='athlon-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='core2duo'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='core2duo-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='coreduo'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='coreduo-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='n270'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='n270-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='phenom'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='phenom-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </cpu>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <memoryBacking supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <enum name='sourceType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>file</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>anonymous</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>memfd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </memoryBacking>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <devices>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <disk supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='diskDevice'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>disk</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>cdrom</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>floppy</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>lun</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='bus'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>fdc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>scsi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>sata</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-non-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </disk>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <graphics supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vnc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>egl-headless</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dbus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </graphics>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <video supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='modelType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vga</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>cirrus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>none</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>bochs</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>ramfb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </video>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <hostdev supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='mode'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>subsystem</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='startupPolicy'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>default</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>mandatory</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>requisite</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>optional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='subsysType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pci</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>scsi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='capsType'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='pciBackend'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </hostdev>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <rng supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-non-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>random</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>egd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>builtin</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </rng>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <filesystem supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='driverType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>path</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>handle</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtiofs</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </filesystem>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <tpm supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tpm-tis</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tpm-crb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>emulator</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>external</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendVersion'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>2.0</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </tpm>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <redirdev supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='bus'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </redirdev>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <channel supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pty</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>unix</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </channel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <crypto supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>qemu</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>builtin</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </crypto>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <interface supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>default</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>passt</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </interface>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <panic supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>isa</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>hyperv</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </panic>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <console supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>null</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pty</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dev</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>file</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pipe</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>stdio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>udp</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tcp</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>unix</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>qemu-vdagent</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dbus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </console>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </devices>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <gic supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <vmcoreinfo supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <genid supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <backingStoreInput supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <backup supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <async-teardown supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <ps2 supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <sev supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <sgx supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <hyperv supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='features'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>relaxed</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vapic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>spinlocks</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vpindex</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>runtime</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>synic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>stimer</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>reset</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vendor_id</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>frequencies</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>reenlightenment</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tlbflush</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>ipi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>avic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>emsr_bitmap</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>xmm_input</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <defaults>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <spinlocks>4095</spinlocks>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <stimer_direct>on</stimer_direct>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </defaults>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </hyperv>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <launchSecurity supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='sectype'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tdx</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </launchSecurity>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: </domainCapabilities>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.264 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 11 00:58:12 np0005554845 nova_compute[186187]: <domainCapabilities>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <domain>kvm</domain>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <arch>x86_64</arch>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <vcpu max='240'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <iothreads supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <os supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <enum name='firmware'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <loader supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>rom</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pflash</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='readonly'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>yes</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>no</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='secure'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>no</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </loader>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </os>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <cpu>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='host-passthrough' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='hostPassthroughMigratable'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>on</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>off</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='maximum' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='maximumMigratable'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>on</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>off</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='host-model' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <vendor>AMD</vendor>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='x2apic'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='hypervisor'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='stibp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='overflow-recov'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='succor'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='lbrv'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='tsc-scale'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='flushbyasid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='pause-filter'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='pfthreshold'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <feature policy='disable' name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <mode name='custom' supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Broadwell-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Cooperlake-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Denverton-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Dhyana-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Genoa'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='auto-ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='auto-ibrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Milan-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amd-psfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='stibp-always-on'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-Rome-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='EPYC-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='GraniteRapids-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-128'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-256'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx10-512'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='prefetchiti'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Haswell-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v6'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Icelake-Server-v7'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='IvyBridge-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='KnightsMill'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512er'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512pf'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='KnightsMill-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512er'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512pf'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G4-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tbm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Opteron_G5-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fma4'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tbm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xop'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SapphireRapids-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='amx-tile'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-bf16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-fp16'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bitalg'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrc'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fzrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='la57'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='taa-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xfd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SierraForest'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cmpccxadd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='SierraForest-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ifma'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cmpccxadd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fbsdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='fsrs'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ibrs-all'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mcdt-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pbrsb-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='psdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='serialize'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vaes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Client-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='hle'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='rtm'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Skylake-Server-v5'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512bw'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512cd'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512dq'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512f'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='avx512vl'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='invpcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pcid'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='pku'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='mpx'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v2'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v3'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='core-capability'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='split-lock-detect'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='Snowridge-v4'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='cldemote'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='erms'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='gfni'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdir64b'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='movdiri'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='xsaves'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='athlon'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='athlon-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='core2duo'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='core2duo-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='coreduo'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='coreduo-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='n270'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='n270-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='ss'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='phenom'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <blockers model='phenom-v1'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnow'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <feature name='3dnowext'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </blockers>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </mode>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </cpu>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <memoryBacking supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <enum name='sourceType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>file</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>anonymous</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <value>memfd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </memoryBacking>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <devices>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <disk supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='diskDevice'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>disk</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>cdrom</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>floppy</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>lun</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='bus'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>ide</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>fdc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>scsi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>sata</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-non-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </disk>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <graphics supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vnc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>egl-headless</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dbus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </graphics>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <video supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='modelType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vga</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>cirrus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>none</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>bochs</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>ramfb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </video>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <hostdev supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='mode'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>subsystem</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='startupPolicy'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>default</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>mandatory</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>requisite</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>optional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='subsysType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pci</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>scsi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='capsType'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='pciBackend'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </hostdev>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <rng supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtio-non-transitional</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>random</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>egd</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>builtin</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </rng>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <filesystem supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='driverType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>path</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>handle</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>virtiofs</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </filesystem>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <tpm supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tpm-tis</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tpm-crb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>emulator</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>external</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendVersion'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>2.0</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </tpm>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <redirdev supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='bus'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>usb</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </redirdev>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <channel supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pty</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>unix</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </channel>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <crypto supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>qemu</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendModel'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>builtin</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </crypto>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <interface supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='backendType'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>default</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>passt</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </interface>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <panic supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='model'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>isa</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>hyperv</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </panic>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <console supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='type'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>null</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vc</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pty</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dev</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>file</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>pipe</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>stdio</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>udp</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tcp</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>unix</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>qemu-vdagent</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>dbus</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </console>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </devices>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <gic supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <vmcoreinfo supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <genid supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <backingStoreInput supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <backup supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <async-teardown supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <ps2 supported='yes'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <sev supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <sgx supported='no'/>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <hyperv supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='features'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>relaxed</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vapic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>spinlocks</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vpindex</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>runtime</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>synic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>stimer</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>reset</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>vendor_id</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>frequencies</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>reenlightenment</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tlbflush</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>ipi</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>avic</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>emsr_bitmap</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>xmm_input</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <defaults>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <spinlocks>4095</spinlocks>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <stimer_direct>on</stimer_direct>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </defaults>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </hyperv>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    <launchSecurity supported='yes'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      <enum name='sectype'>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:        <value>tdx</value>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:      </enum>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:    </launchSecurity>
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  </features>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: </domainCapabilities>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.332 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.333 186191 INFO nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Secure Boot support detected#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.336 186191 INFO nova.virt.libvirt.driver [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.337 186191 INFO nova.virt.libvirt.driver [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.349 186191 DEBUG nova.virt.libvirt.driver [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] cpu compare xml: <cpu match="exact">
Dec 11 00:58:12 np0005554845 nova_compute[186187]:  <model>Nehalem</model>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: </cpu>
Dec 11 00:58:12 np0005554845 nova_compute[186187]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.352 186191 DEBUG nova.virt.libvirt.driver [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.377 186191 INFO nova.virt.node [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Determined node identity eece7817-9d4f-4ebe-96c8-a659f76170f9 from /var/lib/nova/compute_id#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.393 186191 WARNING nova.compute.manager [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Compute nodes ['eece7817-9d4f-4ebe-96c8-a659f76170f9'] for host compute-2.ctlplane.example.com were not found in the database. If this is the first time this service is starting on this host, then you can ignore this warning.#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.421 186191 INFO nova.compute.manager [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.456 186191 WARNING nova.compute.manager [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] No compute node record found for host compute-2.ctlplane.example.com. If this is the first time this service is starting on this host, then you can ignore this warning.: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host compute-2.ctlplane.example.com could not be found.#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.456 186191 DEBUG oslo_concurrency.lockutils [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.457 186191 DEBUG oslo_concurrency.lockutils [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.457 186191 DEBUG oslo_concurrency.lockutils [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.457 186191 DEBUG nova.compute.resource_tracker [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 00:58:12 np0005554845 systemd[1]: Starting libvirt nodedev daemon...
Dec 11 00:58:12 np0005554845 systemd[1]: Started libvirt nodedev daemon.
Dec 11 00:58:12 np0005554845 python3.9[186869]: ansible-containers.podman.podman_container Invoked with name=nova_nvme_cleaner state=absent executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.752 186191 WARNING nova.virt.libvirt.driver [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.753 186191 DEBUG nova.compute.resource_tracker [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6178MB free_disk=73.53269958496094GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.753 186191 DEBUG oslo_concurrency.lockutils [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.753 186191 DEBUG oslo_concurrency.lockutils [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.789 186191 WARNING nova.compute.resource_tracker [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] No compute node record for compute-2.ctlplane.example.com:eece7817-9d4f-4ebe-96c8-a659f76170f9: nova.exception_Remote.ComputeHostNotFound_Remote: Compute host eece7817-9d4f-4ebe-96c8-a659f76170f9 could not be found.#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.815 186191 INFO nova.compute.resource_tracker [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Compute node record created for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com with uuid: eece7817-9d4f-4ebe-96c8-a659f76170f9#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.877 186191 DEBUG nova.compute.resource_tracker [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 00:58:12 np0005554845 nova_compute[186187]: 2025-12-11 05:58:12.877 186191 DEBUG nova.compute.resource_tracker [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 00:58:12 np0005554845 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.367 186191 INFO nova.scheduler.client.report [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] [req-c3f9284d-feee-4556-bca9-94abb2900fbf] Created resource provider record via placement API for resource provider with UUID eece7817-9d4f-4ebe-96c8-a659f76170f9 and name compute-2.ctlplane.example.com.#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.417 186191 DEBUG nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 11 00:58:13 np0005554845 nova_compute[186187]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.417 186191 INFO nova.virt.libvirt.host [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] kernel doesn't support AMD SEV#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.418 186191 DEBUG nova.compute.provider_tree [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.419 186191 DEBUG nova.virt.libvirt.driver [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.423 186191 DEBUG nova.virt.libvirt.driver [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Libvirt baseline CPU <cpu>
Dec 11 00:58:13 np0005554845 nova_compute[186187]:  <arch>x86_64</arch>
Dec 11 00:58:13 np0005554845 nova_compute[186187]:  <model>Nehalem</model>
Dec 11 00:58:13 np0005554845 nova_compute[186187]:  <vendor>AMD</vendor>
Dec 11 00:58:13 np0005554845 nova_compute[186187]:  <topology sockets="8" cores="1" threads="1"/>
Dec 11 00:58:13 np0005554845 nova_compute[186187]: </cpu>
Dec 11 00:58:13 np0005554845 nova_compute[186187]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.467 186191 DEBUG nova.scheduler.client.report [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Updated inventory for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with generation 0 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 0}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.468 186191 DEBUG nova.compute.provider_tree [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Updating resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 generation from 0 to 1 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.468 186191 DEBUG nova.compute.provider_tree [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.541 186191 DEBUG nova.compute.provider_tree [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Updating resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 generation from 1 to 2 during operation: update_traits _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.567 186191 DEBUG nova.compute.resource_tracker [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.567 186191 DEBUG oslo_concurrency.lockutils [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.814s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.567 186191 DEBUG nova.service [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Dec 11 00:58:13 np0005554845 python3.9[187066]: ansible-ansible.builtin.systemd Invoked with name=edpm_nova_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.655 186191 DEBUG nova.service [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.655 186191 DEBUG nova.servicegroup.drivers.db [None req-e5ad203c-8cae-4813-9e46-b842a3c3368f - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Dec 11 00:58:13 np0005554845 systemd[1]: Stopping nova_compute container...
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.795 186191 WARNING amqp [-] Received method (60, 30) during closing channel 1. This method will be ignored#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.798 186191 DEBUG oslo_concurrency.lockutils [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.798 186191 DEBUG oslo_concurrency.lockutils [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 00:58:13 np0005554845 nova_compute[186187]: 2025-12-11 05:58:13.798 186191 DEBUG oslo_concurrency.lockutils [None req-ee8a5d92-4c2b-4742-883d-3e289d493e7c - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 00:58:14 np0005554845 virtqemud[186638]: libvirt version: 11.9.0, package: 1.el9 (builder@centos.org, 2025-11-04-09:54:50, )
Dec 11 00:58:14 np0005554845 virtqemud[186638]: hostname: compute-2
Dec 11 00:58:14 np0005554845 virtqemud[186638]: End of file while reading data: Input/output error
Dec 11 00:58:14 np0005554845 systemd[1]: libpod-8f9d1bf61a04ea3d56c20556de75b47d6923bc71eba63b09b075a8c8fb4c9224.scope: Deactivated successfully.
Dec 11 00:58:14 np0005554845 systemd[1]: libpod-8f9d1bf61a04ea3d56c20556de75b47d6923bc71eba63b09b075a8c8fb4c9224.scope: Consumed 3.380s CPU time.
Dec 11 00:58:14 np0005554845 podman[187070]: 2025-12-11 05:58:14.215819938 +0000 UTC m=+0.492924226 container died 8f9d1bf61a04ea3d56c20556de75b47d6923bc71eba63b09b075a8c8fb4c9224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=nova_compute, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, org.label-schema.build-date=20251202)
Dec 11 00:58:14 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f9d1bf61a04ea3d56c20556de75b47d6923bc71eba63b09b075a8c8fb4c9224-userdata-shm.mount: Deactivated successfully.
Dec 11 00:58:14 np0005554845 systemd[1]: var-lib-containers-storage-overlay-71b0756e406d514a584060d166ab3eda0201a538887ce2728bc4437e19359d8d-merged.mount: Deactivated successfully.
Dec 11 00:58:14 np0005554845 podman[187070]: 2025-12-11 05:58:14.285009531 +0000 UTC m=+0.562113779 container cleanup 8f9d1bf61a04ea3d56c20556de75b47d6923bc71eba63b09b075a8c8fb4c9224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=edpm, container_name=nova_compute, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, org.label-schema.schema-version=1.0)
Dec 11 00:58:14 np0005554845 podman[187070]: nova_compute
Dec 11 00:58:14 np0005554845 podman[187100]: nova_compute
Dec 11 00:58:14 np0005554845 systemd[1]: edpm_nova_compute.service: Deactivated successfully.
Dec 11 00:58:14 np0005554845 systemd[1]: Stopped nova_compute container.
Dec 11 00:58:14 np0005554845 systemd[1]: Starting nova_compute container...
Dec 11 00:58:14 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:58:14 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b0756e406d514a584060d166ab3eda0201a538887ce2728bc4437e19359d8d/merged/etc/nvme supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:14 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b0756e406d514a584060d166ab3eda0201a538887ce2728bc4437e19359d8d/merged/etc/multipath supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:14 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b0756e406d514a584060d166ab3eda0201a538887ce2728bc4437e19359d8d/merged/var/lib/iscsi supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:14 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b0756e406d514a584060d166ab3eda0201a538887ce2728bc4437e19359d8d/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:14 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/71b0756e406d514a584060d166ab3eda0201a538887ce2728bc4437e19359d8d/merged/var/lib/libvirt supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:14 np0005554845 podman[187113]: 2025-12-11 05:58:14.487948298 +0000 UTC m=+0.091862026 container init 8f9d1bf61a04ea3d56c20556de75b47d6923bc71eba63b09b075a8c8fb4c9224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm)
Dec 11 00:58:14 np0005554845 podman[187113]: 2025-12-11 05:58:14.495267106 +0000 UTC m=+0.099180804 container start 8f9d1bf61a04ea3d56c20556de75b47d6923bc71eba63b09b075a8c8fb4c9224 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': True, 'user': 'nova', 'restart': 'always', 'command': 'kolla_start', 'net': 'host', 'pid': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'volumes': ['/var/lib/openstack/config/nova:/var/lib/kolla/config_files:ro', '/var/lib/openstack/cacerts/nova/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/etc/localtime:/etc/localtime:ro', '/lib/modules:/lib/modules:ro', '/dev:/dev', '/var/lib/libvirt:/var/lib/libvirt', '/run/libvirt:/run/libvirt:shared', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/etc/iscsi:/etc/iscsi:ro', '/etc/nvme:/etc/nvme', '/var/lib/openstack/config/ceph:/var/lib/kolla/config_files/ceph:ro', '/etc/ssh/ssh_known_hosts:/etc/ssh/ssh_known_hosts:ro']}, container_name=nova_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 11 00:58:14 np0005554845 podman[187113]: nova_compute
Dec 11 00:58:14 np0005554845 nova_compute[187128]: + sudo -E kolla_set_configs
Dec 11 00:58:14 np0005554845 systemd[1]: Started nova_compute container.
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Validating config file
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Copying service configuration files
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Deleting /etc/nova/nova.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /etc/nova/nova.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Deleting /etc/nova/nova.conf.d/01-nova.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Copying /var/lib/kolla/config_files/01-nova.conf to /etc/nova/nova.conf.d/01-nova.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/01-nova.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Deleting /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Copying /var/lib/kolla/config_files/25-nova-extra.conf to /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/25-nova-extra.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Deleting /etc/nova/nova.conf.d/nova-blank.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Copying /var/lib/kolla/config_files/nova-blank.conf to /etc/nova/nova.conf.d/nova-blank.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/nova-blank.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Deleting /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Copying /var/lib/kolla/config_files/02-nova-host-specific.conf to /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /etc/nova/nova.conf.d/02-nova-host-specific.conf
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Deleting /etc/ceph
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Creating directory /etc/ceph
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /etc/ceph
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Deleting /var/lib/nova/.ssh/ssh-privatekey
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-privatekey to /var/lib/nova/.ssh/ssh-privatekey
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Deleting /var/lib/nova/.ssh/config
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Copying /var/lib/kolla/config_files/ssh-config to /var/lib/nova/.ssh/config
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Deleting /usr/sbin/iscsiadm
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Copying /var/lib/kolla/config_files/run-on-host to /usr/sbin/iscsiadm
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /usr/sbin/iscsiadm
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Writing out command to execute
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/ssh-privatekey
Dec 11 00:58:14 np0005554845 nova_compute[187128]: INFO:__main__:Setting permission for /var/lib/nova/.ssh/config
Dec 11 00:58:14 np0005554845 nova_compute[187128]: ++ cat /run_command
Dec 11 00:58:14 np0005554845 nova_compute[187128]: + CMD=nova-compute
Dec 11 00:58:14 np0005554845 nova_compute[187128]: + ARGS=
Dec 11 00:58:14 np0005554845 nova_compute[187128]: + sudo kolla_copy_cacerts
Dec 11 00:58:14 np0005554845 nova_compute[187128]: + [[ ! -n '' ]]
Dec 11 00:58:14 np0005554845 nova_compute[187128]: + . kolla_extend_start
Dec 11 00:58:14 np0005554845 nova_compute[187128]: Running command: 'nova-compute'
Dec 11 00:58:14 np0005554845 nova_compute[187128]: + echo 'Running command: '\''nova-compute'\'''
Dec 11 00:58:14 np0005554845 nova_compute[187128]: + umask 0022
Dec 11 00:58:14 np0005554845 nova_compute[187128]: + exec nova-compute
Dec 11 00:58:15 np0005554845 python3.9[187291]: ansible-containers.podman.podman_container Invoked with name=nova_compute_init state=started executable=podman detach=True debug=False force_restart=False force_delete=True generate_systemd={} image_strict=False recreate=False image=None annotation=None arch=None attach=None authfile=None blkio_weight=None blkio_weight_device=None cap_add=None cap_drop=None cgroup_conf=None cgroup_parent=None cgroupns=None cgroups=None chrootdirs=None cidfile=None cmd_args=None conmon_pidfile=None command=None cpu_period=None cpu_quota=None cpu_rt_period=None cpu_rt_runtime=None cpu_shares=None cpus=None cpuset_cpus=None cpuset_mems=None decryption_key=None delete_depend=None delete_time=None delete_volumes=None detach_keys=None device=None device_cgroup_rule=None device_read_bps=None device_read_iops=None device_write_bps=None device_write_iops=None dns=None dns_option=None dns_search=None entrypoint=None env=None env_file=None env_host=None env_merge=None etc_hosts=None expose=None gidmap=None gpus=None group_add=None group_entry=None healthcheck=None healthcheck_interval=None healthcheck_retries=None healthcheck_start_period=None health_startup_cmd=None health_startup_interval=None health_startup_retries=None health_startup_success=None health_startup_timeout=None healthcheck_timeout=None healthcheck_failure_action=None hooks_dir=None hostname=None hostuser=None http_proxy=None image_volume=None init=None init_ctr=None init_path=None interactive=None ip=None ip6=None ipc=None kernel_memory=None label=None label_file=None log_driver=None log_level=None log_opt=None mac_address=None memory=None memory_reservation=None memory_swap=None memory_swappiness=None mount=None network=None network_aliases=None no_healthcheck=None no_hosts=None oom_kill_disable=None oom_score_adj=None os=None passwd=None passwd_entry=None personality=None pid=None pid_file=None pids_limit=None platform=None pod=None pod_id_file=None preserve_fd=None 
preserve_fds=None privileged=None publish=None publish_all=None pull=None quadlet_dir=None quadlet_filename=None quadlet_file_mode=None quadlet_options=None rdt_class=None read_only=None read_only_tmpfs=None requires=None restart_policy=None restart_time=None retry=None retry_delay=None rm=None rmi=None rootfs=None seccomp_policy=None secrets=NOT_LOGGING_PARAMETER sdnotify=None security_opt=None shm_size=None shm_size_systemd=None sig_proxy=None stop_signal=None stop_timeout=None stop_time=None subgidname=None subuidname=None sysctl=None systemd=None timeout=None timezone=None tls_verify=None tmpfs=None tty=None uidmap=None ulimit=None umask=None unsetenv=None unsetenv_all=None user=None userns=None uts=None variant=None volume=None volumes_from=None workdir=None
Dec 11 00:58:15 np0005554845 systemd[1]: Started libpod-conmon-3764a6d7e3369d4d867089ab2fb6d42aa6760e1e242de1f610cf05070e000066.scope.
Dec 11 00:58:15 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:58:15 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fc354cfc6eb0284796b7ae65aa278d657f1ea2e0af7d4f4aa63c5e72527ceb1/merged/usr/sbin/nova_statedir_ownership.py supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:15 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fc354cfc6eb0284796b7ae65aa278d657f1ea2e0af7d4f4aa63c5e72527ceb1/merged/var/lib/nova supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:15 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8fc354cfc6eb0284796b7ae65aa278d657f1ea2e0af7d4f4aa63c5e72527ceb1/merged/var/lib/_nova_secontext supports timestamps until 2038 (0x7fffffff)
Dec 11 00:58:15 np0005554845 podman[187318]: 2025-12-11 05:58:15.649981858 +0000 UTC m=+0.131594435 container init 3764a6d7e3369d4d867089ab2fb6d42aa6760e1e242de1f610cf05070e000066 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 11 00:58:15 np0005554845 podman[187318]: 2025-12-11 05:58:15.657575264 +0000 UTC m=+0.139187831 container start 3764a6d7e3369d4d867089ab2fb6d42aa6760e1e242de1f610cf05070e000066 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, container_name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 00:58:15 np0005554845 python3.9[187291]: ansible-containers.podman.podman_container PODMAN-CONTAINER-DEBUG: podman start nova_compute_init
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Applying nova statedir ownership
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Target ownership for /var/lib/nova: 42436:42436
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Changing ownership of /var/lib/nova from 1000:1000 to 42436:42436
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Setting selinux context of /var/lib/nova to system_u:object_r:container_file_t:s0
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Checking uid: 1000 gid: 1000 path: /var/lib/nova/instances/
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Changing ownership of /var/lib/nova/instances from 1000:1000 to 42436:42436
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/instances to system_u:object_r:container_file_t:s0
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Ownership of /var/lib/nova/.ssh already 42436:42436
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Setting selinux context of /var/lib/nova/.ssh to system_u:object_r:container_file_t:s0
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/ssh-privatekey
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Checking uid: 42436 gid: 42436 path: /var/lib/nova/.ssh/config
Dec 11 00:58:15 np0005554845 nova_compute_init[187338]: INFO:nova_statedir:Nova statedir ownership complete
Dec 11 00:58:15 np0005554845 systemd[1]: libpod-3764a6d7e3369d4d867089ab2fb6d42aa6760e1e242de1f610cf05070e000066.scope: Deactivated successfully.
Dec 11 00:58:15 np0005554845 podman[187357]: 2025-12-11 05:58:15.776164608 +0000 UTC m=+0.037831335 container died 3764a6d7e3369d4d867089ab2fb6d42aa6760e1e242de1f610cf05070e000066 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=nova_compute_init, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 11 00:58:15 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-3764a6d7e3369d4d867089ab2fb6d42aa6760e1e242de1f610cf05070e000066-userdata-shm.mount: Deactivated successfully.
Dec 11 00:58:15 np0005554845 systemd[1]: var-lib-containers-storage-overlay-8fc354cfc6eb0284796b7ae65aa278d657f1ea2e0af7d4f4aa63c5e72527ceb1-merged.mount: Deactivated successfully.
Dec 11 00:58:15 np0005554845 podman[187357]: 2025-12-11 05:58:15.804290075 +0000 UTC m=+0.065956772 container cleanup 3764a6d7e3369d4d867089ab2fb6d42aa6760e1e242de1f610cf05070e000066 (image=quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified, name=nova_compute_init, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=nova_compute_init, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-nova-compute:current-podified', 'privileged': False, 'user': 'root', 'restart': 'never', 'command': 'bash -c $* -- eval python3 /sbin/nova_statedir_ownership.py | logger -t nova_compute_init', 'net': 'none', 'security_opt': ['label=disable'], 'detach': False, 'environment': {'NOVA_STATEDIR_OWNERSHIP_SKIP': '/var/lib/nova/compute_id', '__OS_DEBUG': False}, 'volumes': ['/dev/log:/dev/log', '/var/lib/nova:/var/lib/nova:shared', '/var/lib/_nova_secontext:/var/lib/_nova_secontext:shared,z', '/var/lib/openstack/config/nova/nova_statedir_ownership.py:/sbin/nova_statedir_ownership.py:z']}, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 00:58:15 np0005554845 systemd[1]: libpod-conmon-3764a6d7e3369d4d867089ab2fb6d42aa6760e1e242de1f610cf05070e000066.scope: Deactivated successfully.
Dec 11 00:58:16 np0005554845 systemd[1]: session-25.scope: Deactivated successfully.
Dec 11 00:58:16 np0005554845 systemd[1]: session-25.scope: Consumed 2min 2.532s CPU time.
Dec 11 00:58:16 np0005554845 systemd-logind[789]: Session 25 logged out. Waiting for processes to exit.
Dec 11 00:58:16 np0005554845 systemd-logind[789]: Removed session 25.
Dec 11 00:58:16 np0005554845 nova_compute[187128]: 2025-12-11 05:58:16.535 187132 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_linux_bridge.linux_bridge.LinuxBridgePlugin'>' with name 'linux_bridge' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 00:58:16 np0005554845 nova_compute[187128]: 2025-12-11 05:58:16.535 187132 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_noop.noop.NoOpPlugin'>' with name 'noop' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 00:58:16 np0005554845 nova_compute[187128]: 2025-12-11 05:58:16.535 187132 DEBUG os_vif [-] Loaded VIF plugin class '<class 'vif_plug_ovs.ovs.OvsPlugin'>' with name 'ovs' initialize /usr/lib/python3.9/site-packages/os_vif/__init__.py:44#033[00m
Dec 11 00:58:16 np0005554845 nova_compute[187128]: 2025-12-11 05:58:16.535 187132 INFO os_vif [-] Loaded VIF plugins: linux_bridge, noop, ovs#033[00m
Dec 11 00:58:16 np0005554845 nova_compute[187128]: 2025-12-11 05:58:16.659 187132 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): grep -F node.session.scan /sbin/iscsiadm execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 00:58:16 np0005554845 nova_compute[187128]: 2025-12-11 05:58:16.681 187132 DEBUG oslo_concurrency.processutils [-] CMD "grep -F node.session.scan /sbin/iscsiadm" returned: 1 in 0.023s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 00:58:16 np0005554845 nova_compute[187128]: 2025-12-11 05:58:16.682 187132 DEBUG oslo_concurrency.processutils [-] 'grep -F node.session.scan /sbin/iscsiadm' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.220 187132 INFO nova.virt.driver [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] Loading compute driver 'libvirt.LibvirtDriver'#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.343 187132 INFO nova.compute.provider_config [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] No provider configs found in /etc/nova/provider_config/. If files are present, ensure the Nova process has access.#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.350 187132 DEBUG oslo_concurrency.lockutils [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] Acquiring lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.351 187132 DEBUG oslo_concurrency.lockutils [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] Acquired lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.351 187132 DEBUG oslo_concurrency.lockutils [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] Releasing lock "singleton_lock" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.351 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] Full set of CONF: _wait_for_exit_or_signal /usr/lib/python3.9/site-packages/oslo_service/service.py:362#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.351 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.351 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.352 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] command line args: [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.352 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] config files: ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.352 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.352 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] allow_resize_to_same_host      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.352 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] arq_binding_timeout            = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.352 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] backdoor_port                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.352 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] backdoor_socket                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.352 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] block_device_allocate_retries  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.353 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] block_device_allocate_retries_interval = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.353 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cert                           = self.pem log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.353 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute_driver                 = libvirt.LibvirtDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.353 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute_monitors               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.353 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] config_dir                     = ['/etc/nova/nova.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.353 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] config_drive_format            = iso9660 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.353 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] config_file                    = ['/etc/nova/nova.conf', '/etc/nova/nova-compute.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.354 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.354 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] console_host                   = compute-2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.354 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] control_exchange               = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.354 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cpu_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.354 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] daemon                         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.354 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.355 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] default_access_ip_network_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.355 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] default_availability_zone      = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.355 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] default_ephemeral_format       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.355 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'glanceclient=WARN', 'oslo.privsep.daemon=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.355 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] default_schedule_zone          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.355 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] disk_allocation_ratio          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.356 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] enable_new_services            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.356 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] enabled_apis                   = ['osapi_compute', 'metadata'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.356 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] enabled_ssl_apis               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.356 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] flat_injected                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.356 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] force_config_drive             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.357 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] force_raw_images               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.357 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.357 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] heal_instance_info_cache_interval = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.357 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.357 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] initial_cpu_allocation_ratio   = 4.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.357 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] initial_disk_allocation_ratio  = 0.9 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.358 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] initial_ram_allocation_ratio   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.358 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] injected_network_template      = /usr/lib/python3.9/site-packages/nova/virt/interfaces.template log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.358 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] instance_build_timeout         = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.358 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] instance_delete_interval       = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.358 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.358 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] instance_name_template         = instance-%08x log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.358 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] instance_usage_audit           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.359 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] instance_usage_audit_period    = month log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.359 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.359 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] instances_path                 = /var/lib/nova/instances log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.359 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] internal_service_availability_zone = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.359 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] key                            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.360 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] live_migration_retry_count     = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.360 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.360 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.360 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] log_dir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.360 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] log_file                       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.360 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.361 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.361 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.361 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] log_rotation_type              = size log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.361 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.361 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.361 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.361 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.362 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.362 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] long_rpc_timeout               = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.362 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] max_concurrent_builds          = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.362 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] max_concurrent_live_migrations = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.362 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] max_concurrent_snapshots       = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.362 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] max_local_block_devices        = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.363 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] max_logfile_count              = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.363 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] max_logfile_size_mb            = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.363 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] maximum_instance_delete_attempts = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.363 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] metadata_listen                = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.363 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] metadata_listen_port           = 8775 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.363 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] metadata_workers               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.363 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] migrate_max_retries            = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.364 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] mkisofs_cmd                    = /usr/bin/mkisofs log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.364 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] my_block_storage_ip            = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.364 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] my_ip                          = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.364 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] network_allocate_retries       = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.364 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] non_inheritable_image_properties = ['cache_in_nova', 'bittorrent'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.364 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] osapi_compute_listen           = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.364 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] osapi_compute_listen_port      = 8774 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.365 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] osapi_compute_unique_server_name_scope =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.365 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] osapi_compute_workers          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.365 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] password_length                = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.365 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] periodic_enable                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.365 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] periodic_fuzzy_delay           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.365 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] pointer_model                  = usbtablet log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.365 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] preallocate_images             = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.366 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.366 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] pybasedir                      = /usr/lib/python3.9/site-packages log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.366 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ram_allocation_ratio           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.366 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.366 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.366 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.366 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] reboot_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.367 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] reclaim_instance_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.367 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] record                         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.367 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] reimage_timeout_per_gb         = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.367 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] report_interval                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.367 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] rescue_timeout                 = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.367 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] reserved_host_cpus             = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.368 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] reserved_host_disk_mb          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.368 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] reserved_host_memory_mb        = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.368 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] reserved_huge_pages            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.368 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] resize_confirm_window          = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.368 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] resize_fs_using_block_device   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.368 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] resume_guests_state_on_host_boot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.368 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] rootwrap_config                = /etc/nova/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.369 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] rpc_response_timeout           = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.369 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] run_external_periodic_tasks    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.369 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] running_deleted_instance_action = reap log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.369 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] running_deleted_instance_poll_interval = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.369 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] running_deleted_instance_timeout = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.369 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler_instance_sync_interval = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.369 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] service_down_time              = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.370 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] servicegroup_driver            = db log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.370 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] shelved_offload_time           = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.370 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] shelved_poll_interval          = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.370 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] shutdown_timeout               = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.370 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] source_is_ipv6                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.370 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ssl_only                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.370 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] state_path                     = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.371 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] sync_power_state_interval      = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.371 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] sync_power_state_pool_size     = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.371 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.371 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] tempdir                        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.371 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] timeout_nbd                    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.371 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.371 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] update_resources_interval      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.372 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] use_cow_images                 = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.372 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.372 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.372 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.372 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] use_rootwrap_daemon            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.372 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.372 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.373 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vcpu_pin_set                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.373 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plugging_is_fatal          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.373 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plugging_timeout           = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.373 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] virt_mkfs                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.373 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] volume_usage_poll_interval     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.373 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.373 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] web                            = /usr/share/spice-html5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.374 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_concurrency.disable_process_locking = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.374 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_concurrency.lock_path     = /var/lib/nova/tmp log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.374 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_metrics.metrics_buffer_size = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.374 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_metrics.metrics_enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.374 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_metrics.metrics_process_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.374 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_metrics.metrics_socket_file = /var/tmp/metrics_collector.sock log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.374 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_metrics.metrics_thread_stop_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.375 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.auth_strategy              = keystone log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.375 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.compute_link_prefix        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.375 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.config_drive_skip_versions = 1.0 2007-01-19 2007-03-01 2007-08-29 2007-10-10 2007-12-15 2008-02-01 2008-09-01 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.375 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.dhcp_domain                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.375 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.enable_instance_password   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.375 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.glance_link_prefix         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.376 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.instance_list_cells_batch_fixed_size = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.376 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.instance_list_cells_batch_strategy = distributed log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.376 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.instance_list_per_project_cells = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.376 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.list_records_by_skipping_down_cells = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.376 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.local_metadata_per_cell    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.376 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.max_limit                  = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.376 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.metadata_cache_expiration  = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.377 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.neutron_default_tenant_id  = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.377 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.use_forwarded_for          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.377 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.use_neutron_default_nets   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.377 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.vendordata_dynamic_connect_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.377 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.vendordata_dynamic_failure_fatal = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.377 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.vendordata_dynamic_read_timeout = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.378 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.vendordata_dynamic_ssl_certfile =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.378 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.vendordata_dynamic_targets = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.378 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.vendordata_jsonfile_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.378 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api.vendordata_providers       = ['StaticJSON'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.378 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.backend                  = oslo_cache.dict log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.378 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.backend_argument         = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.378 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.config_prefix            = cache.oslo log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.379 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.dead_timeout             = 60.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.379 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.debug_cache_backend      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.379 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.enable_retry_client      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.379 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.enable_socket_keepalive  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.379 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.enabled                  = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.379 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.expiration_time          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.380 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.hashclient_retry_attempts = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.380 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.hashclient_retry_delay   = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.380 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.memcache_dead_retry      = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.380 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.memcache_password        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.380 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.memcache_pool_connection_get_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.381 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.memcache_pool_flush_on_reconnect = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.381 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.memcache_pool_maxsize    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.381 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.memcache_pool_unused_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.381 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.memcache_sasl_enabled    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.381 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.memcache_servers         = ['localhost:11211'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.381 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.memcache_socket_timeout  = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.382 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.memcache_username        =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.382 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.proxies                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.382 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.retry_attempts           = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.382 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.retry_delay              = 0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.382 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.socket_keepalive_count   = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.382 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.socket_keepalive_idle    = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.383 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.socket_keepalive_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.383 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.tls_allowed_ciphers      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.383 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.tls_cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.383 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.tls_certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.383 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.tls_enabled              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.383 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cache.tls_keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.383 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.384 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.auth_type               = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.384 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.384 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.catalog_info            = volumev3:cinderv3:internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.384 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.384 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.385 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.cross_az_attach         = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.385 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.385 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.endpoint_template       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.385 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.http_retries            = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.385 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.386 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.386 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.os_region_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.386 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.386 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cinder.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.386 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.consecutive_build_service_disable_threshold = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.386 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.cpu_dedicated_set      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.387 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.cpu_shared_set         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.387 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.image_type_exclude_list = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.387 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.live_migration_wait_for_vif_plug = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.387 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.max_concurrent_disk_ops = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.387 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.max_disk_devices_to_attach = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.388 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.packing_host_numa_cells_allocation_strategy = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.388 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.provider_config_location = /etc/nova/provider_config/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.388 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.resource_provider_association_refresh = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.388 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.shutdown_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.388 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] compute.vmdk_allowed_types     = ['streamOptimized', 'monolithicSparse'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.388 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] conductor.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.389 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] console.allowed_origins        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.389 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] console.ssl_ciphers            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.389 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] console.ssl_minimum_version    = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.389 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] consoleauth.token_ttl          = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.389 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.389 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.389 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.390 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.390 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.390 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.390 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.390 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.390 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.390 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.391 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.391 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.391 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.service_type            = accelerator log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.391 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.391 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.391 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.391 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.391 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.392 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] cyborg.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.392 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.backend               = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.392 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.connection            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.392 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.connection_debug      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.392 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.392 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.393 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.connection_trace      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.393 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.393 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.db_max_retries        = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.393 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.393 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.db_retry_interval     = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.393 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.max_overflow          = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.393 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.max_pool_size         = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.394 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.max_retries           = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.394 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.mysql_enable_ndb      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.394 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.mysql_sql_mode        = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.394 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.394 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.pool_timeout          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.394 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.retry_interval        = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.394 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.slave_connection      = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.395 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] database.sqlite_synchronous    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.395 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.backend           = sqlalchemy log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.395 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.connection        = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.395 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.connection_debug  = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.395 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.connection_parameters =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.395 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.connection_recycle_time = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.395 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.connection_trace  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.396 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.db_inc_retry_interval = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.396 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.db_max_retries    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.396 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.db_max_retry_interval = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.396 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.db_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.396 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.max_overflow      = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.396 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.max_pool_size     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.396 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.max_retries       = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.397 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.mysql_enable_ndb  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.397 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.mysql_sql_mode    = TRADITIONAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.397 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.mysql_wsrep_sync_wait = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.397 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.pool_timeout      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.397 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.retry_interval    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.397 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.slave_connection  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.397 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] api_database.sqlite_synchronous = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.398 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] devices.enabled_mdev_types     = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.398 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ephemeral_storage_encryption.cipher = aes-xts-plain64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.398 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ephemeral_storage_encryption.enabled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.398 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ephemeral_storage_encryption.key_size = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.398 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.api_servers             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.399 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.399 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.399 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.399 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.399 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.399 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.debug                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.400 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.default_trusted_certificate_ids = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.400 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.enable_certificate_validation = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.400 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.enable_rbd_download     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.400 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.400 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.400 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.400 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.400 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.401 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.num_retries             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.401 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.rbd_ceph_conf           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.401 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.rbd_connect_timeout     = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.401 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.rbd_pool                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.401 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.rbd_user                =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.401 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.region_name             = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.401 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.402 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.service_type            = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.402 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.402 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.402 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.402 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.402 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.valid_interfaces        = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.402 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.verify_glance_signatures = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.403 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] glance.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.403 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] guestfs.debug                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.403 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.config_drive_cdrom      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.403 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.config_drive_inject_password = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.403 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.dynamic_memory_ratio    = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.403 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.enable_instance_metrics_collection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.403 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.enable_remotefx         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.403 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.instances_path_share    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.404 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.iscsi_initiator_list    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.404 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.limit_cpu_features      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.404 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.mounted_disk_query_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.404 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.mounted_disk_query_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.404 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.power_state_check_timeframe = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.404 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.power_state_event_polling_interval = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.404 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.qemu_img_cmd            = qemu-img.exe log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.405 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.use_multipath_io        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.405 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.volume_attach_retry_count = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.405 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.volume_attach_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.405 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.vswitch_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.405 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] hyperv.wait_soft_reboot_seconds = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.405 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] mks.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.406 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] mks.mksproxy_base_url          = http://127.0.0.1:6090/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.406 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] image_cache.manager_interval   = 2400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.406 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] image_cache.precache_concurrency = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.406 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] image_cache.remove_unused_base_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.406 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] image_cache.remove_unused_original_minimum_age_seconds = 86400 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.406 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] image_cache.remove_unused_resized_minimum_age_seconds = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.406 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] image_cache.subdirectory_name  = _base log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.407 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.api_max_retries         = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.407 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.api_retry_interval      = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.407 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.auth_section            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.407 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.auth_type               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.407 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.cafile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.407 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.certfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.407 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.collect_timing          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.408 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.connect_retries         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.408 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.connect_retry_delay     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.408 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.endpoint_override       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.408 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.408 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.keyfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.408 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.max_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.408 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.min_version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.408 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.partition_key           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.409 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.peer_list               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.409 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.region_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.409 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.serial_console_state_timeout = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.409 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.service_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.409 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.service_type            = baremetal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.409 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.split_loggers           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.409 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.status_code_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.410 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.410 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.timeout                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.410 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.valid_interfaces        = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.410 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ironic.version                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.410 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] key_manager.backend            = barbican log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.410 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] key_manager.fixed_key          = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.410 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.auth_endpoint         = http://localhost/identity/v3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.411 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.barbican_api_version  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.411 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.barbican_endpoint     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.411 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.barbican_endpoint_type = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.411 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.barbican_region_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.411 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.411 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.412 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.412 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.412 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.412 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.number_of_retries     = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.412 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.retry_delay           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.412 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.send_service_user_token = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.412 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.413 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.413 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.verify_ssl            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.413 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican.verify_ssl_path       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.413 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican_service_user.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.413 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican_service_user.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.413 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican_service_user.cafile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.413 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican_service_user.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.414 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican_service_user.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.414 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican_service_user.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.414 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican_service_user.keyfile  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.414 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican_service_user.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.414 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] barbican_service_user.timeout  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.414 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.approle_role_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.415 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.approle_secret_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.415 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.415 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.415 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.415 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.415 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.415 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.kv_mountpoint            = secret log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.415 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.kv_version               = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.416 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.namespace                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.416 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.root_token_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.416 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.416 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.ssl_ca_crt_file          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.416 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.416 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.use_ssl                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.417 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vault.vault_url                = http://127.0.0.1:8200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.417 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.cafile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.417 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.certfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.417 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.collect_timing        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.417 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.connect_retries       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.417 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.connect_retry_delay   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.418 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.endpoint_override     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.418 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.insecure              = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.418 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.keyfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.418 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.max_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.418 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.min_version           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.418 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.region_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.418 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.service_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.419 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.service_type          = identity log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.419 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.split_loggers         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.419 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.status_code_retries   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.419 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.419 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.timeout               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.419 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.valid_interfaces      = ['internal', 'public'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.419 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] keystone.version               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.420 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.connection_uri         =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.420 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.cpu_mode               = custom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.420 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.cpu_model_extra_flags  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.420 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.cpu_models             = ['Nehalem'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.420 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.cpu_power_governor_high = performance log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.420 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.cpu_power_governor_low = powersave log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.420 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.cpu_power_management   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.421 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.cpu_power_management_strategy = cpu_state log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.421 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.device_detach_attempts = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.421 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.device_detach_timeout  = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.421 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.disk_cachemodes        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.421 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.disk_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.421 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.enabled_perf_events    = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.422 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.file_backed_memory     = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.422 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.gid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.422 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.hw_disk_discard        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.422 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.hw_machine_type        = ['x86_64=q35'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.422 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.images_rbd_ceph_conf   =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.422 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.images_rbd_glance_copy_poll_interval = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.423 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.images_rbd_glance_copy_timeout = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.423 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.images_rbd_glance_store_name =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.423 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.images_rbd_pool        = rbd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.423 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.images_type            = qcow2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.423 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.images_volume_group    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.423 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.inject_key             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.423 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.inject_partition       = -2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.424 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.inject_password        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.424 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.iscsi_iface            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.424 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.iser_use_multipath     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.424 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_bandwidth = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.424 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_completion_timeout = 800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.424 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_downtime = 500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.424 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_downtime_delay = 75 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.425 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_downtime_steps = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.425 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_inbound_addr = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.425 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_permit_auto_converge = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.425 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_permit_post_copy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.425 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_scheme  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.425 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_timeout_action = force_complete log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.425 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_tunnelled = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.426 187132 WARNING oslo_config.cfg [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] Deprecated: Option "live_migration_uri" from group "libvirt" is deprecated for removal (
Dec 11 00:58:17 np0005554845 nova_compute[187128]: live_migration_uri is deprecated for removal in favor of two other options that
Dec 11 00:58:17 np0005554845 nova_compute[187128]: allow to change live migration scheme and target URI: ``live_migration_scheme``
Dec 11 00:58:17 np0005554845 nova_compute[187128]: and ``live_migration_inbound_addr`` respectively.
Dec 11 00:58:17 np0005554845 nova_compute[187128]: ).  Its value may be silently ignored in the future.#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.426 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_uri     = qemu+tls://%s/system log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.426 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.live_migration_with_native_tls = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.426 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.max_queues             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.426 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.mem_stats_period_seconds = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.427 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.nfs_mount_options      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.427 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.nfs_mount_point_base   = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.427 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.num_aoe_discover_tries = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.427 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.num_iser_scan_tries    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.427 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.num_memory_encrypted_guests = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.428 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.num_nvme_discover_tries = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.428 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.num_pcie_ports         = 24 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.428 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.num_volume_scan_tries  = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.428 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.pmem_namespaces        = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.428 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.quobyte_client_cfg     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.428 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.quobyte_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.429 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.rbd_connect_timeout    = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.429 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.rbd_destroy_volume_retries = 12 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.429 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.rbd_destroy_volume_retry_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.429 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.rbd_secret_uuid        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.429 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.rbd_user               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.429 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.realtime_scheduler_priority = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.430 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.remote_filesystem_transport = ssh log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.430 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.rescue_image_id        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.430 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.rescue_kernel_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.430 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.rescue_ramdisk_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.430 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.rng_dev_path           = /dev/urandom log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.430 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.rx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.430 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.smbfs_mount_options    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.431 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.smbfs_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.431 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.snapshot_compression   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.431 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.snapshot_image_format  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.431 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.snapshots_directory    = /var/lib/nova/instances/snapshots log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.431 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.sparse_logical_volumes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.431 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.swtpm_enabled          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.431 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.swtpm_group            = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.432 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.swtpm_user             = tss log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.432 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.sysinfo_serial         = unique log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.432 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.tx_queue_size          = 512 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.432 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.uid_maps               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.432 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.use_virtio_for_bridges = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.432 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.virt_type              = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.433 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.volume_clear           = zero log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.433 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.volume_clear_size      = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.433 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.volume_use_multipath   = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.433 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.vzstorage_cache_path   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.433 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.vzstorage_log_path     = /var/log/vstorage/%(cluster_name)s/nova.log.gz log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.434 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.vzstorage_mount_group  = qemu log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.434 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.vzstorage_mount_opts   = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.434 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.vzstorage_mount_perms  = 0770 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.434 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.vzstorage_mount_point_base = /var/lib/nova/mnt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.434 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.vzstorage_mount_user   = stack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.434 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] libvirt.wait_soft_reboot_seconds = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.435 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.auth_section           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.435 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.auth_type              = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.435 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.435 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.435 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.435 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.connect_retries        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.436 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.connect_retry_delay    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.436 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.default_floating_pool  = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.436 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.endpoint_override      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.436 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.extension_sync_interval = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.436 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.http_retries           = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.436 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.436 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.437 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.max_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.437 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.metadata_proxy_shared_secret = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.437 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.min_version            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.437 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.ovs_bridge             = br-int log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.437 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.physnets               = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.437 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.region_name            = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.437 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.service_metadata_proxy = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.438 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.service_name           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.438 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.service_type           = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.438 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.438 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.status_code_retries    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.438 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.438 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.438 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.valid_interfaces       = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.439 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] neutron.version                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.439 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] notifications.bdms_in_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.439 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] notifications.default_level    = INFO log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.439 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] notifications.notification_format = unversioned log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.439 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] notifications.notify_on_state_change = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.439 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] notifications.versioned_notifications_topics = ['versioned_notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.440 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] pci.alias                      = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.440 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] pci.device_spec                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.440 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] pci.report_in_placement        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.440 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.auth_section         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.440 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.auth_type            = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.440 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.auth_url             = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.440 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.cafile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.441 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.certfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.441 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.collect_timing       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.441 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.connect_retries      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.441 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.connect_retry_delay  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.441 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.default_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.441 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.default_domain_name  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.442 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.domain_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.442 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.domain_name          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.442 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.endpoint_override    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.442 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.insecure             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.442 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.keyfile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.442 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.max_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.442 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.min_version          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.442 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.password             = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.443 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.project_domain_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.443 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.project_domain_name  = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.443 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.project_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.443 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.project_name         = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.443 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.region_name          = regionOne log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.443 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.service_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.444 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.service_type         = placement log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.444 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.split_loggers        = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.444 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.status_code_retries  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.444 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.444 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.system_scope         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.444 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.timeout              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.445 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.trust_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.445 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.user_domain_id       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.445 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.user_domain_name     = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.445 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.user_id              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.445 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.username             = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.446 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.valid_interfaces     = ['internal'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.446 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] placement.version              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.446 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.cores                    = 20 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.446 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.count_usage_from_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.446 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.driver                   = nova.quota.DbQuotaDriver log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.446 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.injected_file_content_bytes = 10240 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.447 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.injected_file_path_length = 255 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.447 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.injected_files           = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.447 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.instances                = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.447 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.key_pairs                = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.447 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.metadata_items           = 128 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.447 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.ram                      = 51200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.448 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.recheck_quota            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.448 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.server_group_members     = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.448 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] quota.server_groups            = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.448 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] rdp.enabled                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.448 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] rdp.html5_proxy_base_url       = http://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.448 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler.discover_hosts_in_cells_interval = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.449 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler.enable_isolated_aggregate_filtering = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.449 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler.image_metadata_prefilter = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.449 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler.limit_tenants_to_placement_aggregate = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.449 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler.max_attempts         = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.449 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler.max_placement_results = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.449 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler.placement_aggregate_required_for_tenants = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.450 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler.query_placement_for_availability_zone = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.450 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler.query_placement_for_image_type_support = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.450 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler.query_placement_for_routed_network_aggregates = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.450 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] scheduler.workers              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.450 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_namespace = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.450 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.aggregate_image_properties_isolation_separator = . log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.451 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.available_filters = ['nova.scheduler.filters.all_filters'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.451 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.build_failure_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.451 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.cpu_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.451 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.cross_cell_move_weight_multiplier = 1000000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.451 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.disk_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.451 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.enabled_filters = ['ComputeFilter', 'ComputeCapabilitiesFilter', 'ImagePropertiesFilter', 'ServerGroupAntiAffinityFilter', 'ServerGroupAffinityFilter'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.452 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.host_subset_size = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.452 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.image_properties_default_architecture = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.452 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.io_ops_weight_multiplier = -1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.452 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.isolated_hosts = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.452 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.isolated_images = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.453 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.max_instances_per_host = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.453 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.max_io_ops_per_host = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.453 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.pci_in_placement = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.453 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.pci_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.453 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.ram_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.453 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.restrict_isolated_hosts_to_isolated_images = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.454 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.shuffle_best_same_weighed_hosts = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.454 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.soft_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.454 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.soft_anti_affinity_weight_multiplier = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.454 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.track_instance_changes = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.454 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] filter_scheduler.weight_classes = ['nova.scheduler.weights.all_weighers'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.454 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] metrics.required               = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.454 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] metrics.weight_multiplier      = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.455 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] metrics.weight_of_unavailable  = -10000.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.455 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] metrics.weight_setting         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.455 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] serial_console.base_url        = ws://127.0.0.1:6083/ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.455 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] serial_console.enabled         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.455 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] serial_console.port_range      = 10000:20000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.456 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] serial_console.proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.456 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] serial_console.serialproxy_host = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.456 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] serial_console.serialproxy_port = 6083 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.456 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] service_user.auth_section      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.456 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] service_user.auth_type         = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.456 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] service_user.cafile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.457 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] service_user.certfile          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.457 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] service_user.collect_timing    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.457 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] service_user.insecure          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.457 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] service_user.keyfile           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.457 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] service_user.send_service_user_token = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.457 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] service_user.split_loggers     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.457 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] service_user.timeout           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.458 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.agent_enabled            = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.458 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.enabled                  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.458 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.html5proxy_base_url      = http://127.0.0.1:6082/spice_auto.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.458 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.html5proxy_host          = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.458 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.html5proxy_port          = 6082 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.459 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.image_compression        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.459 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.jpeg_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.459 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.playback_compression     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.459 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.server_listen            = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.459 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.server_proxyclient_address = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.460 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.streaming_mode           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.460 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] spice.zlib_compression         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.460 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] upgrade_levels.baseapi         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.460 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] upgrade_levels.cert            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.460 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] upgrade_levels.compute         = auto log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.460 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] upgrade_levels.conductor       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.460 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] upgrade_levels.scheduler       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.461 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vendordata_dynamic_auth.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.461 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vendordata_dynamic_auth.auth_type = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.461 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vendordata_dynamic_auth.cafile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.461 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vendordata_dynamic_auth.certfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.461 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vendordata_dynamic_auth.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.461 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vendordata_dynamic_auth.insecure = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.461 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vendordata_dynamic_auth.keyfile = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.462 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vendordata_dynamic_auth.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.462 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vendordata_dynamic_auth.timeout = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.462 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.462 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.462 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.cache_prefix            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.462 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.cluster_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.462 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.connection_pool_size    = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.463 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.console_delay_seconds   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.463 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.datastore_regex         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.463 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.host_ip                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.463 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.463 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.463 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.host_username           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.464 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.464 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.integration_bridge      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.464 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.maximum_objects         = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.464 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.pbm_default_policy      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.464 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.pbm_enabled             = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.464 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.pbm_wsdl_location       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.464 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.serial_log_dir          = /opt/vmware/vspc log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.465 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.serial_port_proxy_uri   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.465 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.serial_port_service_uri = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.465 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.465 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.use_linked_clone        = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.465 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.vnc_keymap              = en-us log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.465 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.vnc_port                = 5900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.465 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vmware.vnc_port_total          = 10000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.466 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vnc.auth_schemes               = ['none'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.466 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vnc.enabled                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.466 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vnc.novncproxy_base_url        = https://nova-novncproxy-cell1-public-openstack.apps-crc.testing/vnc_lite.html log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.466 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vnc.novncproxy_host            = 0.0.0.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.466 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vnc.novncproxy_port            = 6080 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.467 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vnc.server_listen              = ::0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.467 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vnc.server_proxyclient_address = 192.168.122.102 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.467 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vnc.vencrypt_ca_certs          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.467 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vnc.vencrypt_client_cert       = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.467 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vnc.vencrypt_client_key        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.467 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.disable_compute_service_check_for_ffu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.468 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.disable_deep_image_inspection = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.468 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.disable_fallback_pcpu_query = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.468 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.disable_group_policy_check_upcall = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.468 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.disable_libvirt_livesnapshot = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.468 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.disable_rootwrap   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.468 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.enable_numa_live_migration = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.468 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.enable_qemu_monitor_announce_self = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.469 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.ensure_libvirt_rbd_instance_dir_cleanup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.469 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.handle_virt_lifecycle_events = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.469 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.libvirt_disable_apic = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.469 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.never_download_image_if_on_rbd = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.469 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.qemu_monitor_announce_self_count = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.469 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.qemu_monitor_announce_self_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.470 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.reserve_disk_resource_for_image_cache = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.470 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.skip_cpu_compare_at_startup = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.470 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.skip_cpu_compare_on_dest = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.470 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.skip_hypervisor_version_check_on_lm = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.470 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.skip_reserve_in_use_ironic_nodes = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.470 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.unified_limits_count_pcpu_as_vcpu = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.470 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] workarounds.wait_for_vif_plugged_event_during_hard_reboot = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.471 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] wsgi.api_paste_config          = api-paste.ini log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.471 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] wsgi.client_socket_timeout     = 900 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.471 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] wsgi.default_pool_size         = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.471 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] wsgi.keep_alive                = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.472 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] wsgi.max_header_line           = 16384 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.472 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] wsgi.secure_proxy_ssl_header   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.472 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] wsgi.ssl_ca_file               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.472 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] wsgi.ssl_cert_file             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.473 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] wsgi.ssl_key_file              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.473 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] wsgi.tcp_keepidle              = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.473 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] wsgi.wsgi_log_format           = %(client_ip)s "%(request_line)s" status: %(status_code)s len: %(body_length)s time: %(wall_seconds).7f log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.474 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] zvm.ca_file                    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.474 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] zvm.cloud_connector_url        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.474 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] zvm.image_tmp_path             = /var/lib/nova/images log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.475 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] zvm.reachable_timeout          = 300 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.475 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_policy.enforce_new_defaults = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.476 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_policy.enforce_scope      = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.476 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_policy.policy_default_rule = default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.476 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_policy.policy_dirs        = ['policy.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.476 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_policy.policy_file        = policy.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.477 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_policy.remote_content_type = application/x-www-form-urlencoded log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.477 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_policy.remote_ssl_ca_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.477 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_policy.remote_ssl_client_crt_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.478 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_policy.remote_ssl_client_key_file = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.478 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_policy.remote_ssl_verify_server_crt = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.478 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_versionedobjects.fatal_exception_format_errors = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.478 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_middleware.http_basic_auth_user_file = /etc/htpasswd log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.479 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] remote_debug.host              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.479 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] remote_debug.port              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.479 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.479 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.amqp_durable_queues = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.479 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.480 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.480 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.480 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.480 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.480 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.481 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.481 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.481 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.481 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.481 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.482 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.482 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.482 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.482 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.482 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.483 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.483 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.483 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rabbit_quorum_queue = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.483 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.483 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.483 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.484 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.484 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.484 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.484 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.484 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.484 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.484 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.485 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.485 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.485 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.485 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.485 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.auth_section        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.486 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.auth_type           = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.486 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.auth_url            = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.486 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.cafile              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.486 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.certfile            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.486 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.collect_timing      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.487 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.connect_retries     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.487 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.connect_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.487 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.default_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.487 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.487 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.domain_id           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.487 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.domain_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.487 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.endpoint_id         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.488 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.endpoint_override   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.488 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.insecure            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.488 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.keyfile             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.488 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.max_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.488 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.min_version         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.488 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.password            = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.488 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.project_domain_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.489 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.project_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.489 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.project_id          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.489 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.project_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.489 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.region_name         = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.489 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.service_name        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.490 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.service_type        = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.490 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.split_loggers       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.490 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.status_code_retries = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.490 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.status_code_retry_delay = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.490 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.system_scope        = all log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.490 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.timeout             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.491 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.trust_id            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.491 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.user_domain_id      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.491 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.user_domain_name    = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.491 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.user_id             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.491 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.username            = nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.491 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.valid_interfaces    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.491 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_limit.version             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.491 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_reports.file_event_handler = /var/lib/nova log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.492 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_reports.file_event_handler_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.492 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] oslo_reports.log_dir           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.492 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_linux_bridge_privileged.capabilities = [12] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.492 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_linux_bridge_privileged.group = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.492 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_linux_bridge_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.492 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_linux_bridge_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.493 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_linux_bridge_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.493 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_linux_bridge_privileged.user = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.493 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_ovs_privileged.capabilities = [12, 1] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.493 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_ovs_privileged.group  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.493 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_ovs_privileged.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.493 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_ovs_privileged.logger_name = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.493 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_ovs_privileged.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.494 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] vif_plug_ovs_privileged.user   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.494 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_linux_bridge.flat_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.494 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_linux_bridge.forward_bridge_interface = ['all'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.494 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_linux_bridge.iptables_bottom_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.494 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_linux_bridge.iptables_drop_action = DROP log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.494 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_linux_bridge.iptables_top_regex =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.494 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_linux_bridge.network_device_mtu = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.495 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_linux_bridge.use_ipv6   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.495 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_linux_bridge.vlan_interface = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.495 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_ovs.isolate_vif         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.495 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_ovs.network_device_mtu  = 1500 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.495 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_ovs.ovs_vsctl_timeout   = 120 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.495 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_ovs.ovsdb_connection    = tcp:127.0.0.1:6640 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.495 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_ovs.ovsdb_interface     = native log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.496 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_vif_ovs.per_port_bridge     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.496 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_brick.lock_path             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.496 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_brick.wait_mpath_device_attempts = 4 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.496 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] os_brick.wait_mpath_device_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.496 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] privsep_osbrick.capabilities   = [21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.496 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] privsep_osbrick.group          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.496 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] privsep_osbrick.helper_command = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.496 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] privsep_osbrick.logger_name    = os_brick.privileged log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.497 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] privsep_osbrick.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.497 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] privsep_osbrick.user           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.497 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] nova_sys_admin.capabilities    = [0, 1, 2, 3, 12, 21] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.497 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] nova_sys_admin.group           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.497 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] nova_sys_admin.helper_command  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.497 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] nova_sys_admin.logger_name     = oslo_privsep.daemon log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.497 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] nova_sys_admin.thread_pool_size = 8 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.498 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] nova_sys_admin.user            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.498 187132 DEBUG oslo_service.service [None req-353fbfd7-40f5-4a35-8b7e-8b4dc4bec120 - - - - - -] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.498 187132 INFO nova.service [-] Starting compute node (version 27.5.2-0.20250829104910.6f8decf.el9)#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.522 187132 INFO nova.virt.node [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Determined node identity eece7817-9d4f-4ebe-96c8-a659f76170f9 from /var/lib/nova/compute_id#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.522 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Starting native event thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:492#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.523 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Starting green dispatch thread _init_events /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:498#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.523 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Starting connection event dispatch thread initialize /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:620#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.524 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Connecting to libvirt: qemu:///system _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:503#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.536 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Registering for lifecycle events <nova.virt.libvirt.host.Host object at 0x7f733903e550> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:509#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.539 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Registering for connection events: <nova.virt.libvirt.host.Host object at 0x7f733903e550> _get_new_connection /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:530#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.541 187132 INFO nova.virt.libvirt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Connection event '1' reason 'None'#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.547 187132 INFO nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Libvirt host capabilities <capabilities>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <host>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <uuid>a858e7f7-0b50-46e3-b377-83982336f3bb</uuid>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <cpu>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <arch>x86_64</arch>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model>EPYC-Rome-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <vendor>AMD</vendor>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <microcode version='16777317'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <signature family='23' model='49' stepping='0'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <topology sockets='8' dies='1' clusters='1' cores='1' threads='1'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <maxphysaddr mode='emulate' bits='40'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='x2apic'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='tsc-deadline'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='osxsave'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='hypervisor'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='tsc_adjust'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='spec-ctrl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='stibp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='arch-capabilities'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='cmp_legacy'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='topoext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='virt-ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='lbrv'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='tsc-scale'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='vmcb-clean'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='pause-filter'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='pfthreshold'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='svme-addr-chk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='rdctl-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='skip-l1dfl-vmentry'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='mds-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature name='pschange-mc-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <pages unit='KiB' size='4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <pages unit='KiB' size='2048'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <pages unit='KiB' size='1048576'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </cpu>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <power_management>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <suspend_mem/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <suspend_disk/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <suspend_hybrid/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </power_management>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <iommu support='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <migration_features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <live/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <uri_transports>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <uri_transport>tcp</uri_transport>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <uri_transport>rdma</uri_transport>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </uri_transports>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </migration_features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <topology>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <cells num='1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <cell id='0'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:          <memory unit='KiB'>7864308</memory>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:          <pages unit='KiB' size='4'>1966077</pages>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:          <pages unit='KiB' size='2048'>0</pages>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:          <pages unit='KiB' size='1048576'>0</pages>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:          <distances>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:            <sibling id='0' value='10'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:          </distances>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:          <cpus num='8'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:            <cpu id='0' socket_id='0' die_id='0' cluster_id='65535' core_id='0' siblings='0'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:            <cpu id='1' socket_id='1' die_id='1' cluster_id='65535' core_id='0' siblings='1'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:            <cpu id='2' socket_id='2' die_id='2' cluster_id='65535' core_id='0' siblings='2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:            <cpu id='3' socket_id='3' die_id='3' cluster_id='65535' core_id='0' siblings='3'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:            <cpu id='4' socket_id='4' die_id='4' cluster_id='65535' core_id='0' siblings='4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:            <cpu id='5' socket_id='5' die_id='5' cluster_id='65535' core_id='0' siblings='5'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:            <cpu id='6' socket_id='6' die_id='6' cluster_id='65535' core_id='0' siblings='6'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:            <cpu id='7' socket_id='7' die_id='7' cluster_id='65535' core_id='0' siblings='7'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:          </cpus>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        </cell>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </cells>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </topology>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <cache>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='0' level='2' type='both' size='512' unit='KiB' cpus='0'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='1' level='2' type='both' size='512' unit='KiB' cpus='1'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='2' level='2' type='both' size='512' unit='KiB' cpus='2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='3' level='2' type='both' size='512' unit='KiB' cpus='3'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='4' level='2' type='both' size='512' unit='KiB' cpus='4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='5' level='2' type='both' size='512' unit='KiB' cpus='5'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='6' level='2' type='both' size='512' unit='KiB' cpus='6'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='7' level='2' type='both' size='512' unit='KiB' cpus='7'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='0' level='3' type='both' size='16' unit='MiB' cpus='0'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='1' level='3' type='both' size='16' unit='MiB' cpus='1'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='2' level='3' type='both' size='16' unit='MiB' cpus='2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='3' level='3' type='both' size='16' unit='MiB' cpus='3'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='4' level='3' type='both' size='16' unit='MiB' cpus='4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='5' level='3' type='both' size='16' unit='MiB' cpus='5'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='6' level='3' type='both' size='16' unit='MiB' cpus='6'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <bank id='7' level='3' type='both' size='16' unit='MiB' cpus='7'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </cache>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <secmodel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model>selinux</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <doi>0</doi>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <baselabel type='kvm'>system_u:system_r:svirt_t:s0</baselabel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <baselabel type='qemu'>system_u:system_r:svirt_tcg_t:s0</baselabel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </secmodel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <secmodel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model>dac</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <doi>0</doi>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <baselabel type='kvm'>+107:+107</baselabel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <baselabel type='qemu'>+107:+107</baselabel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </secmodel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </host>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <guest>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <os_type>hvm</os_type>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <arch name='i686'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <wordsize>32</wordsize>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <domain type='qemu'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <domain type='kvm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </arch>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <pae/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <nonpae/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <acpi default='on' toggle='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <apic default='on' toggle='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <cpuselection/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <deviceboot/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <disksnapshot default='on' toggle='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <externalSnapshot/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </guest>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <guest>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <os_type>hvm</os_type>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <arch name='x86_64'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <wordsize>64</wordsize>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <emulator>/usr/libexec/qemu-kvm</emulator>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='240' deprecated='yes'>pc-i440fx-rhel7.6.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine canonical='pc-i440fx-rhel7.6.0' maxCpus='240' deprecated='yes'>pc</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='4096'>pc-q35-rhel9.8.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine canonical='pc-q35-rhel9.8.0' maxCpus='4096'>q35</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='4096'>pc-q35-rhel9.6.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.6.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710'>pc-q35-rhel9.4.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.5.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.3.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel7.6.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.4.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710'>pc-q35-rhel9.2.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.2.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710'>pc-q35-rhel9.0.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.0.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <machine maxCpus='710' deprecated='yes'>pc-q35-rhel8.1.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <domain type='qemu'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <domain type='kvm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </arch>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <acpi default='on' toggle='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <apic default='on' toggle='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <cpuselection/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <deviceboot/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <disksnapshot default='on' toggle='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <externalSnapshot/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </guest>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 
Dec 11 00:58:17 np0005554845 nova_compute[187128]: </capabilities>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.553 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Getting domain capabilities for i686 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.555 187132 DEBUG nova.virt.libvirt.volume.mount [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Initialising _HostMountState generation 0 host_up /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/mount.py:130
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.557 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=pc:
Dec 11 00:58:17 np0005554845 nova_compute[187128]: <domainCapabilities>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <domain>kvm</domain>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <arch>i686</arch>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <vcpu max='240'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <iothreads supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <os supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <enum name='firmware'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <loader supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>rom</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pflash</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='readonly'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>yes</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>no</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='secure'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>no</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </loader>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </os>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <cpu>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='host-passthrough' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='hostPassthroughMigratable'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>on</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>off</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='maximum' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='maximumMigratable'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>on</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>off</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='host-model' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <vendor>AMD</vendor>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='x2apic'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='hypervisor'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='stibp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='overflow-recov'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='succor'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='lbrv'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc-scale'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='flushbyasid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='pause-filter'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='pfthreshold'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='disable' name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='custom' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Dhyana-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Genoa'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='auto-ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='auto-ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-128'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-256'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-512'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v6'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v7'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='KnightsMill'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512er'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512pf'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='KnightsMill-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512er'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512pf'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G4-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tbm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G5-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tbm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SierraForest'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cmpccxadd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SierraForest-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cmpccxadd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='athlon'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='athlon-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='core2duo'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='core2duo-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='coreduo'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='coreduo-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='n270'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='n270-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='phenom'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='phenom-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <memoryBacking supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <enum name='sourceType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>file</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>anonymous</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>memfd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </memoryBacking>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <devices>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <disk supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='diskDevice'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>disk</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>cdrom</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>floppy</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>lun</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='bus'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>ide</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>fdc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>scsi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>sata</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-non-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </disk>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <graphics supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vnc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>egl-headless</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dbus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </graphics>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <video supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='modelType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vga</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>cirrus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>none</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>bochs</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>ramfb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </video>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <hostdev supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='mode'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>subsystem</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='startupPolicy'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>default</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>mandatory</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>requisite</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>optional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='subsysType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pci</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>scsi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='capsType'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='pciBackend'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </hostdev>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <rng supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-non-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>random</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>egd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>builtin</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </rng>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <filesystem supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='driverType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>path</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>handle</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtiofs</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </filesystem>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <tpm supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tpm-tis</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tpm-crb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>emulator</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>external</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendVersion'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>2.0</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </tpm>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <redirdev supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='bus'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </redirdev>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <channel supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pty</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>unix</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </channel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <crypto supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>qemu</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>builtin</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </crypto>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <interface supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>default</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>passt</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </interface>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <panic supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>isa</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>hyperv</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </panic>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <console supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>null</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pty</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dev</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>file</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pipe</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>stdio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>udp</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tcp</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>unix</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>qemu-vdagent</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dbus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </console>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </devices>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <gic supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <vmcoreinfo supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <genid supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <backingStoreInput supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <backup supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <async-teardown supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <ps2 supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <sev supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <sgx supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <hyperv supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='features'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>relaxed</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vapic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>spinlocks</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vpindex</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>runtime</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>synic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>stimer</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>reset</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vendor_id</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>frequencies</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>reenlightenment</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tlbflush</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>ipi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>avic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>emsr_bitmap</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>xmm_input</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <defaults>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <spinlocks>4095</spinlocks>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <stimer_direct>on</stimer_direct>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </defaults>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </hyperv>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <launchSecurity supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='sectype'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tdx</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </launchSecurity>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: </domainCapabilities>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.562 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Libvirt host hypervisor capabilities for arch=i686 and machine_type=q35:
Dec 11 00:58:17 np0005554845 nova_compute[187128]: <domainCapabilities>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <domain>kvm</domain>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <machine>pc-q35-rhel9.8.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <arch>i686</arch>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <vcpu max='4096'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <iothreads supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <os supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <enum name='firmware'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <loader supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>rom</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pflash</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='readonly'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>yes</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>no</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='secure'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>no</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </loader>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </os>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <cpu>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='host-passthrough' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='hostPassthroughMigratable'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>on</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>off</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='maximum' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='maximumMigratable'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>on</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>off</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='host-model' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <vendor>AMD</vendor>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='x2apic'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='hypervisor'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='stibp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='overflow-recov'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='succor'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='lbrv'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc-scale'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='flushbyasid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='pause-filter'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='pfthreshold'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='disable' name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='custom' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Dhyana-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Genoa'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='auto-ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='auto-ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-128'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-256'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-512'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v6'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v7'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='KnightsMill'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512er'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512pf'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='KnightsMill-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512er'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512pf'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G4-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tbm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G5-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tbm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SierraForest'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cmpccxadd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SierraForest-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cmpccxadd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='athlon'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='athlon-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='core2duo'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='core2duo-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='coreduo'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='coreduo-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='n270'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='n270-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='phenom'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='phenom-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <memoryBacking supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <enum name='sourceType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>file</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>anonymous</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>memfd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </memoryBacking>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <devices>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <disk supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='diskDevice'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>disk</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>cdrom</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>floppy</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>lun</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='bus'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>fdc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>scsi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>sata</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-non-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </disk>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <graphics supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vnc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>egl-headless</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dbus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </graphics>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <video supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='modelType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vga</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>cirrus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>none</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>bochs</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>ramfb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </video>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <hostdev supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='mode'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>subsystem</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='startupPolicy'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>default</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>mandatory</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>requisite</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>optional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='subsysType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pci</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>scsi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='capsType'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='pciBackend'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </hostdev>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <rng supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-non-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>random</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>egd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>builtin</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </rng>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <filesystem supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='driverType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>path</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>handle</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtiofs</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </filesystem>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <tpm supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tpm-tis</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tpm-crb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>emulator</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>external</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendVersion'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>2.0</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </tpm>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <redirdev supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='bus'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </redirdev>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <channel supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pty</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>unix</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </channel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <crypto supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>qemu</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>builtin</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </crypto>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <interface supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>default</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>passt</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </interface>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <panic supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>isa</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>hyperv</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </panic>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <console supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>null</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pty</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dev</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>file</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pipe</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>stdio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>udp</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tcp</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>unix</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>qemu-vdagent</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dbus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </console>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </devices>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <gic supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <vmcoreinfo supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <genid supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <backingStoreInput supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <backup supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <async-teardown supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <ps2 supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <sev supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <sgx supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <hyperv supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='features'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>relaxed</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vapic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>spinlocks</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vpindex</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>runtime</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>synic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>stimer</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>reset</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vendor_id</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>frequencies</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>reenlightenment</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tlbflush</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>ipi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>avic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>emsr_bitmap</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>xmm_input</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <defaults>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <spinlocks>4095</spinlocks>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <stimer_direct>on</stimer_direct>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </defaults>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </hyperv>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <launchSecurity supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='sectype'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tdx</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </launchSecurity>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: </domainCapabilities>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.592 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Getting domain capabilities for x86_64 via machine types: {'pc', 'q35'} _get_machine_types /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:952
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.598 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=pc:
Dec 11 00:58:17 np0005554845 nova_compute[187128]: <domainCapabilities>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <domain>kvm</domain>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <machine>pc-i440fx-rhel7.6.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <arch>x86_64</arch>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <vcpu max='240'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <iothreads supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <os supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <enum name='firmware'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <loader supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>/usr/share/OVMF/OVMF_CODE.secboot.fd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>rom</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pflash</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='readonly'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>yes</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>no</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='secure'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>no</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </loader>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </os>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <cpu>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='host-passthrough' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='hostPassthroughMigratable'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>on</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>off</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='maximum' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='maximumMigratable'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>on</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>off</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='host-model' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <vendor>AMD</vendor>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='x2apic'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='hypervisor'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='stibp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='overflow-recov'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='succor'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='lbrv'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc-scale'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='flushbyasid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='pause-filter'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='pfthreshold'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='disable' name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='custom' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Dhyana-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Genoa'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='auto-ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='auto-ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-128'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-256'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-512'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v6'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v7'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='KnightsMill'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512er'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512pf'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='KnightsMill-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512er'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512pf'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G4-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tbm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G5-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tbm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SierraForest'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cmpccxadd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SierraForest-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cmpccxadd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='athlon'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='athlon-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='core2duo'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='core2duo-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='coreduo'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='coreduo-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='n270'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='n270-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='phenom'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='phenom-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <memoryBacking supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <enum name='sourceType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>file</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>anonymous</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>memfd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </memoryBacking>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <devices>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <disk supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='diskDevice'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>disk</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>cdrom</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>floppy</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>lun</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='bus'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>ide</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>fdc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>scsi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>sata</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-non-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </disk>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <graphics supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vnc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>egl-headless</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dbus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </graphics>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <video supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='modelType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vga</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>cirrus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>none</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>bochs</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>ramfb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </video>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <hostdev supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='mode'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>subsystem</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='startupPolicy'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>default</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>mandatory</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>requisite</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>optional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='subsysType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pci</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>scsi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='capsType'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='pciBackend'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </hostdev>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <rng supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-non-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>random</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>egd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>builtin</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </rng>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <filesystem supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='driverType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>path</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>handle</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtiofs</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </filesystem>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <tpm supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tpm-tis</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tpm-crb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>emulator</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>external</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendVersion'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>2.0</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </tpm>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <redirdev supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='bus'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </redirdev>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <channel supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pty</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>unix</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </channel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <crypto supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>qemu</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>builtin</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </crypto>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <interface supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>default</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>passt</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </interface>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <panic supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>isa</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>hyperv</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </panic>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <console supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>null</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pty</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dev</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>file</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pipe</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>stdio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>udp</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tcp</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>unix</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>qemu-vdagent</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dbus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </console>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </devices>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <gic supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <vmcoreinfo supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <genid supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <backingStoreInput supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <backup supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <async-teardown supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <ps2 supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <sev supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <sgx supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <hyperv supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='features'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>relaxed</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vapic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>spinlocks</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vpindex</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>runtime</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>synic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>stimer</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>reset</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vendor_id</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>frequencies</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>reenlightenment</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tlbflush</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>ipi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>avic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>emsr_bitmap</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>xmm_input</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <defaults>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <spinlocks>4095</spinlocks>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <stimer_direct>on</stimer_direct>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </defaults>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </hyperv>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <launchSecurity supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='sectype'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tdx</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </launchSecurity>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: </domainCapabilities>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.654 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Libvirt host hypervisor capabilities for arch=x86_64 and machine_type=q35:
Dec 11 00:58:17 np0005554845 nova_compute[187128]: <domainCapabilities>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <path>/usr/libexec/qemu-kvm</path>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <domain>kvm</domain>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <machine>pc-q35-rhel9.8.0</machine>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <arch>x86_64</arch>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <vcpu max='4096'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <iothreads supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <os supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <enum name='firmware'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>efi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <loader supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.secboot.fd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>/usr/share/edk2/ovmf/OVMF_CODE.fd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>/usr/share/edk2/ovmf/OVMF.amdsev.fd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>/usr/share/edk2/ovmf/OVMF.inteltdx.secboot.fd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>rom</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pflash</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='readonly'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>yes</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>no</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='secure'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>yes</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>no</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </loader>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </os>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <cpu>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='host-passthrough' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='hostPassthroughMigratable'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>on</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>off</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='maximum' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='maximumMigratable'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>on</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>off</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='host-model' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model fallback='forbid'>EPYC-Rome</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <vendor>AMD</vendor>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <maxphysaddr mode='passthrough' limit='40'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='x2apic'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc-deadline'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='hypervisor'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc_adjust'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='spec-ctrl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='stibp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='cmp_legacy'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='overflow-recov'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='succor'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='amd-ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='virt-ssbd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='lbrv'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='tsc-scale'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='vmcb-clean'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='flushbyasid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='pause-filter'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='pfthreshold'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='svme-addr-chk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='require' name='lfence-always-serializing'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <feature policy='disable' name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <mode name='custom' supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='486-v1'>486</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>486-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v1'>Broadwell</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v3'>Broadwell-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v2'>Broadwell-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Broadwell-v4'>Broadwell-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Broadwell-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Broadwell-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v1'>Cascadelake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cascadelake-Server-v3'>Cascadelake-Server-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cascadelake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cascadelake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Conroe-v1'>Conroe</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel'>Conroe-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Cooperlake-v1'>Cooperlake</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cooperlake-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Cooperlake-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Cooperlake-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Denverton-v1'>Denverton</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Denverton-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Denverton-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Hygon' canonical='Dhyana-v1'>Dhyana</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Hygon'>Dhyana-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Hygon'>Dhyana-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Dhyana-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD' canonical='EPYC-v1'>EPYC</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Genoa-v1'>EPYC-Genoa</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Genoa'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='auto-ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Genoa-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Genoa-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='auto-ibrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD' canonical='EPYC-v2'>EPYC-IBPB</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Milan-v1'>EPYC-Milan</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Milan-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Milan-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Milan-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amd-psfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='no-nested-data-bp'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='null-sel-clr-base'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='stibp-always-on'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='EPYC-Rome-v1'>EPYC-Rome</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-Rome-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-Rome-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-Rome-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='AMD'>EPYC-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>EPYC-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='EPYC-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='GraniteRapids-v1'>GraniteRapids</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>GraniteRapids-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>GraniteRapids-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='GraniteRapids-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-128'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-256'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx10-512'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='prefetchiti'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v1'>Haswell</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v3'>Haswell-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v2'>Haswell-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Haswell-v4'>Haswell-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Haswell-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Haswell-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v1'>Icelake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Icelake-Server-v2'>Icelake-Server-noTSX</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-noTSX'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v6</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v6'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Icelake-Server-v7</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Icelake-Server-v7'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v1'>IvyBridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='IvyBridge-v2'>IvyBridge-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>IvyBridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>IvyBridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='IvyBridge-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='KnightsMill-v1'>KnightsMill</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='KnightsMill'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512er'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512pf'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>KnightsMill-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='KnightsMill-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4fmaps'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-4vnniw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512er'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512pf'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v1'>Nehalem</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Nehalem-v2'>Nehalem-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Nehalem-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Nehalem-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G1-v1'>Opteron_G1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G1-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G2-v1'>Opteron_G2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G2-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD' canonical='Opteron_G3-v1'>Opteron_G3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='AMD'>Opteron_G3-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='Opteron_G4-v1'>Opteron_G4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>Opteron_G4-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G4-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD' canonical='Opteron_G5-v1'>Opteron_G5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tbm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='AMD'>Opteron_G5-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Opteron_G5-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fma4'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tbm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xop'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel' canonical='Penryn-v1'>Penryn</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='Intel'>Penryn-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v1'>SandyBridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='SandyBridge-v2'>SandyBridge-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>SandyBridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>SandyBridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='SapphireRapids-v1'>SapphireRapids</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SapphireRapids-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SapphireRapids-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='amx-tile'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-bf16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-fp16'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512-vpopcntdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bitalg'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vbmi2'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrc'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fzrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='la57'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='taa-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='tsx-ldtrk'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xfd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='SierraForest-v1'>SierraForest</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SierraForest'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cmpccxadd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>SierraForest-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='SierraForest-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ifma'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-ne-convert'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx-vnni-int8'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='bus-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cmpccxadd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fbsdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='fsrs'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ibrs-all'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mcdt-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pbrsb-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='psdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='sbdr-ssdp-no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='serialize'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vaes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='vpclmulqdq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v1'>Skylake-Client</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v2'>Skylake-Client-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Client-v3'>Skylake-Client-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Client-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Client-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v1'>Skylake-Server</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v2'>Skylake-Server-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Skylake-Server-v3'>Skylake-Server-noTSX-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-noTSX-IBRS'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='hle'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='rtm'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Skylake-Server-v5</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Skylake-Server-v5'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512bw'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512cd'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512dq'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512f'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='avx512vl'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='invpcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pcid'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='pku'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel' canonical='Snowridge-v1'>Snowridge</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='mpx'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v2'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v3'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='core-capability'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='split-lock-detect'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' vendor='Intel'>Snowridge-v4</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='Snowridge-v4'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='cldemote'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='erms'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='gfni'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdir64b'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='movdiri'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='xsaves'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Westmere-v1'>Westmere</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel' canonical='Westmere-v2'>Westmere-IBRS</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Westmere-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' vendor='Intel'>Westmere-v2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='athlon-v1'>athlon</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='athlon'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD'>athlon-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='athlon-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='core2duo-v1'>core2duo</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='core2duo'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>core2duo-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='core2duo-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='coreduo-v1'>coreduo</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='coreduo'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>coreduo-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='coreduo-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm32-v1'>kvm32</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm32-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='kvm64-v1'>kvm64</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>kvm64-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel' canonical='n270-v1'>n270</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='n270'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='Intel'>n270-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='n270-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='ss'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium-v1'>pentium</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium2-v1'>pentium2</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium2-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='pentium3-v1'>pentium3</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>pentium3-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD' canonical='phenom-v1'>phenom</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='phenom'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='no' deprecated='yes' vendor='AMD'>phenom-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <blockers model='phenom-v1'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnow'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <feature name='3dnowext'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </blockers>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu32-v1'>qemu32</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu32-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown' canonical='qemu64-v1'>qemu64</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <model usable='yes' deprecated='yes' vendor='unknown'>qemu64-v1</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </mode>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <memoryBacking supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <enum name='sourceType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>file</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>anonymous</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <value>memfd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </memoryBacking>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <devices>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <disk supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='diskDevice'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>disk</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>cdrom</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>floppy</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>lun</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='bus'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>fdc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>scsi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>sata</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-non-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </disk>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <graphics supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vnc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>egl-headless</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dbus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </graphics>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <video supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='modelType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vga</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>cirrus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>none</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>bochs</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>ramfb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </video>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <hostdev supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='mode'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>subsystem</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='startupPolicy'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>default</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>mandatory</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>requisite</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>optional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='subsysType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pci</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>scsi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='capsType'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='pciBackend'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </hostdev>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <rng supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtio-non-transitional</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>random</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>egd</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>builtin</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </rng>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <filesystem supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='driverType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>path</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>handle</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>virtiofs</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </filesystem>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <tpm supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tpm-tis</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tpm-crb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>emulator</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>external</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendVersion'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>2.0</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </tpm>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <redirdev supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='bus'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>usb</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </redirdev>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <channel supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pty</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>unix</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </channel>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <crypto supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>qemu</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendModel'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>builtin</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </crypto>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <interface supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='backendType'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>default</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>passt</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </interface>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <panic supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='model'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>isa</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>hyperv</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </panic>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <console supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='type'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>null</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vc</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pty</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dev</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>file</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>pipe</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>stdio</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>udp</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tcp</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>unix</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>qemu-vdagent</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>dbus</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </console>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </devices>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <gic supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <vmcoreinfo supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <genid supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <backingStoreInput supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <backup supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <async-teardown supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <ps2 supported='yes'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <sev supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <sgx supported='no'/>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <hyperv supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='features'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>relaxed</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vapic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>spinlocks</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vpindex</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>runtime</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>synic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>stimer</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>reset</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>vendor_id</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>frequencies</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>reenlightenment</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tlbflush</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>ipi</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>avic</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>emsr_bitmap</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>xmm_input</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <defaults>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <spinlocks>4095</spinlocks>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <stimer_direct>on</stimer_direct>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <tlbflush_direct>on</tlbflush_direct>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <tlbflush_extended>on</tlbflush_extended>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <vendor_id>Linux KVM Hv</vendor_id>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </defaults>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </hyperv>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    <launchSecurity supported='yes'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      <enum name='sectype'>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:        <value>tdx</value>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:      </enum>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:    </launchSecurity>
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  </features>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: </domainCapabilities>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: _get_domain_capabilities /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1037#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.717 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.717 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.718 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Checking secure boot support for host arch (x86_64) supports_secure_boot /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1782#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.718 187132 INFO nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Secure Boot support detected#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.719 187132 INFO nova.virt.libvirt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.719 187132 INFO nova.virt.libvirt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] The live_migration_permit_post_copy is set to True and post copy live migration is available so auto-converge will not be in use.#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.727 187132 DEBUG nova.virt.libvirt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] cpu compare xml: <cpu match="exact">
Dec 11 00:58:17 np0005554845 nova_compute[187128]:  <model>Nehalem</model>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: </cpu>
Dec 11 00:58:17 np0005554845 nova_compute[187128]: _compare_cpu /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10019#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.729 187132 DEBUG nova.virt.libvirt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Enabling emulated TPM support _check_vtpm_support /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:1097#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.752 187132 INFO nova.virt.node [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Determined node identity eece7817-9d4f-4ebe-96c8-a659f76170f9 from /var/lib/nova/compute_id#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.796 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Verified node eece7817-9d4f-4ebe-96c8-a659f76170f9 matches my host compute-2.ctlplane.example.com _check_for_host_rename /usr/lib/python3.9/site-packages/nova/compute/manager.py:1568#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.862 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Looking for unclaimed instances stuck in BUILDING status for nodes managed by this host#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.990 187132 DEBUG oslo_concurrency.lockutils [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.991 187132 DEBUG oslo_concurrency.lockutils [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.991 187132 DEBUG oslo_concurrency.lockutils [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:58:17 np0005554845 nova_compute[187128]: 2025-12-11 05:58:17.992 187132 DEBUG nova.compute.resource_tracker [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.169 187132 WARNING nova.virt.libvirt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.171 187132 DEBUG nova.compute.resource_tracker [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6125MB free_disk=73.53116607666016GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.171 187132 DEBUG oslo_concurrency.lockutils [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.172 187132 DEBUG oslo_concurrency.lockutils [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.301 187132 DEBUG nova.compute.resource_tracker [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.302 187132 DEBUG nova.compute.resource_tracker [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.468 187132 DEBUG nova.scheduler.client.report [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Refreshing inventories for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.505 187132 DEBUG nova.scheduler.client.report [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Updating ProviderTree inventory for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.506 187132 DEBUG nova.compute.provider_tree [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.519 187132 DEBUG nova.scheduler.client.report [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Refreshing aggregate associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.540 187132 DEBUG nova.scheduler.client.report [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Refreshing trait associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, traits: COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.562 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] /sys/module/kvm_amd/parameters/sev contains [N
Dec 11 00:58:18 np0005554845 nova_compute[187128]: ] _kernel_supports_amd_sev /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1803#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.563 187132 INFO nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] kernel doesn't support AMD SEV#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.564 187132 DEBUG nova.compute.provider_tree [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.564 187132 DEBUG nova.virt.libvirt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.567 187132 DEBUG nova.virt.libvirt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Libvirt baseline CPU <cpu>
Dec 11 00:58:18 np0005554845 nova_compute[187128]:  <arch>x86_64</arch>
Dec 11 00:58:18 np0005554845 nova_compute[187128]:  <model>Nehalem</model>
Dec 11 00:58:18 np0005554845 nova_compute[187128]:  <vendor>AMD</vendor>
Dec 11 00:58:18 np0005554845 nova_compute[187128]:  <topology sockets="8" cores="1" threads="1"/>
Dec 11 00:58:18 np0005554845 nova_compute[187128]: </cpu>
Dec 11 00:58:18 np0005554845 nova_compute[187128]: _get_guest_baseline_cpu_features /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12537#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.619 187132 DEBUG nova.scheduler.client.report [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.644 187132 DEBUG nova.compute.resource_tracker [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.644 187132 DEBUG oslo_concurrency.lockutils [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.473s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.645 187132 DEBUG nova.service [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Creating RPC server for service compute start /usr/lib/python3.9/site-packages/nova/service.py:182#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.683 187132 DEBUG nova.service [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Join ServiceGroup membership for this service compute start /usr/lib/python3.9/site-packages/nova/service.py:199#033[00m
Dec 11 00:58:18 np0005554845 nova_compute[187128]: 2025-12-11 05:58:18.683 187132 DEBUG nova.servicegroup.drivers.db [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] DB_Driver: join new ServiceGroup member compute-2.ctlplane.example.com to the compute group, service = <Service: host=compute-2.ctlplane.example.com, binary=nova-compute, manager_class_name=nova.compute.manager.ComputeManager> join /usr/lib/python3.9/site-packages/nova/servicegroup/drivers/db.py:44#033[00m
Dec 11 00:58:22 np0005554845 systemd-logind[789]: New session 27 of user zuul.
Dec 11 00:58:22 np0005554845 systemd[1]: Started Session 27 of User zuul.
Dec 11 00:58:23 np0005554845 python3.9[187579]: ansible-ansible.builtin.setup Invoked with gather_subset=['!all', '!min', 'local'] gather_timeout=10 filter=[] fact_path=/etc/ansible/facts.d
Dec 11 00:58:25 np0005554845 python3.9[187735]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:58:25 np0005554845 systemd[1]: Reloading.
Dec 11 00:58:25 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:58:25 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:58:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:58:26.209 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:58:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:58:26.210 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:58:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:58:26.211 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:58:27 np0005554845 python3.9[187920]: ansible-ansible.builtin.service_facts Invoked
Dec 11 00:58:27 np0005554845 network[187937]: You are using 'network' service provided by 'network-scripts', which are now deprecated.
Dec 11 00:58:27 np0005554845 network[187938]: 'network-scripts' will be removed from distribution in near future.
Dec 11 00:58:27 np0005554845 network[187939]: It is advised to switch to 'NetworkManager' instead for network management.
Dec 11 00:58:30 np0005554845 podman[188012]: 2025-12-11 05:58:30.483185514 +0000 UTC m=+0.077134859 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 11 00:58:30 np0005554845 podman[188013]: 2025-12-11 05:58:30.525711001 +0000 UTC m=+0.111684950 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 00:58:32 np0005554845 podman[188227]: 2025-12-11 05:58:32.728822108 +0000 UTC m=+0.055007662 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 11 00:58:32 np0005554845 python3.9[188273]: ansible-ansible.builtin.systemd_service Invoked with enabled=False name=tripleo_ceilometer_agent_compute.service state=stopped daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:58:33 np0005554845 python3.9[188426]: ansible-ansible.builtin.file Invoked with path=/usr/lib/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:58:34 np0005554845 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 00:58:34 np0005554845 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 00:58:34 np0005554845 python3.9[188579]: ansible-ansible.builtin.file Invoked with path=/etc/systemd/system/tripleo_ceilometer_agent_compute.service state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:58:35 np0005554845 python3.9[188731]: ansible-ansible.legacy.command Invoked with _raw_params=if systemctl is-active certmonger.service; then#012  systemctl disable --now certmonger.service#012  test -f /etc/systemd/system/certmonger.service || systemctl mask certmonger.service#012fi#012 _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:58:36 np0005554845 python3.9[188883]: ansible-ansible.builtin.find Invoked with file_type=any hidden=True paths=['/var/lib/certmonger/requests'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 00:58:37 np0005554845 python3.9[189035]: ansible-ansible.builtin.systemd_service Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:58:37 np0005554845 systemd[1]: Reloading.
Dec 11 00:58:37 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:58:37 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:58:39 np0005554845 python3.9[189222]: ansible-ansible.legacy.command Invoked with cmd=/usr/bin/systemctl reset-failed tripleo_ceilometer_agent_compute.service _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True _raw_params=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 00:58:39 np0005554845 python3.9[189375]: ansible-ansible.builtin.file Invoked with group=zuul mode=0750 owner=zuul path=/var/lib/openstack/config/telemetry recurse=True setype=container_file_t state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:58:40 np0005554845 python3.9[189525]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:58:41 np0005554845 python3.9[189677]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:58:42 np0005554845 python3.9[189798]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf mode=0644 setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432721.234117-362-32573327576732/.source.conf follow=False _original_basename=ceilometer-host-specific.conf.j2 checksum=d3d36c542f4af449a66988015465dd0bb4b47bb9 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:58:43 np0005554845 python3.9[189950]: ansible-ansible.builtin.group Invoked with name=libvirt state=present force=False system=False local=False non_unique=False gid=None gid_min=None gid_max=None
Dec 11 00:58:44 np0005554845 python3.9[190102]: ansible-ansible.builtin.getent Invoked with database=passwd key=ceilometer fail_key=True service=None split=None
Dec 11 00:58:45 np0005554845 python3.9[190255]: ansible-ansible.builtin.group Invoked with gid=42405 name=ceilometer state=present force=False system=False local=False non_unique=False gid_min=None gid_max=None
Dec 11 00:58:46 np0005554845 python3.9[190413]: ansible-ansible.builtin.user Invoked with comment=ceilometer user group=ceilometer groups=['libvirt'] name=ceilometer shell=/sbin/nologin state=present uid=42405 non_unique=False force=False remove=False create_home=True system=False move_home=False append=False ssh_key_bits=0 ssh_key_type=rsa ssh_key_comment=ansible-generated on compute-2 update_password=always home=None password=NOT_LOGGING_PARAMETER login_class=None password_expire_max=None password_expire_min=None password_expire_warn=None hidden=None seuser=None skeleton=None generate_ssh_key=None ssh_key_file=None ssh_key_passphrase=NOT_LOGGING_PARAMETER expires=None password_lock=None local=None profile=None authorization=None role=None umask=None password_expire_account_disable=None uid_min=None uid_max=None
Dec 11 00:58:48 np0005554845 python3.9[190571]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:58:48 np0005554845 python3.9[190692]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765432727.6245592-565-213458417256708/.source.conf _original_basename=ceilometer.conf follow=False checksum=f74f01c63e6cdeca5458ef9aff2a1db5d6a4e4b9 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:58:49 np0005554845 python3.9[190842]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/polling.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:58:50 np0005554845 python3.9[190963]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/polling.yaml mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765432728.9506278-565-250259409883345/.source.yaml _original_basename=polling.yaml follow=False checksum=6c8680a286285f2e0ef9fa528ca754765e5ed0e5 backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:58:50 np0005554845 python3.9[191113]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/custom.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:58:51 np0005554845 python3.9[191234]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/custom.conf mode=0640 remote_src=False src=/home/zuul/.ansible/tmp/ansible-tmp-1765432730.3061502-565-113938015597765/.source.conf _original_basename=custom.conf follow=False checksum=838b8b0a7d7f72e55ab67d39f32e3cb3eca2139b backup=False force=True unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:58:52 np0005554845 python3.9[191384]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.crt follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:58:52 np0005554845 python3.9[191536]: ansible-ansible.builtin.stat Invoked with path=/var/lib/openstack/certs/telemetry/default/tls.key follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:58:53 np0005554845 python3.9[191688]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:58:54 np0005554845 python3.9[191809]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432733.2651484-743-255034147144423/.source.json follow=False _original_basename=ceilometer-agent-compute.json.j2 checksum=264d11e8d3809e7ef745878dce7edd46098e25b2 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:58:54 np0005554845 python3.9[191959]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:58:55 np0005554845 python3.9[192035]: ansible-ansible.legacy.file Invoked with mode=420 dest=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf _original_basename=ceilometer-host-specific.conf.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer-host-specific.conf force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:58:56 np0005554845 python3.9[192185]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:58:56 np0005554845 python3.9[192306]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_agent_compute.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432735.661064-743-126218451227784/.source.json follow=False _original_basename=ceilometer_agent_compute.json.j2 checksum=17453a32c9d181134878b3e453cb84c3cd9bd67d backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:58:57 np0005554845 python3.9[192456]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:58:58 np0005554845 python3.9[192577]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432736.9855003-743-256995475635245/.source.yaml follow=False _original_basename=ceilometer_prom_exporter.yaml.j2 checksum=10157c879411ee6023e506dc85a343cedc52700f backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:58:58 np0005554845 nova_compute[187128]: 2025-12-11 05:58:58.686 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 00:58:58 np0005554845 nova_compute[187128]: 2025-12-11 05:58:58.709 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 00:58:58 np0005554845 python3.9[192727]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/firewall.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:58:59 np0005554845 python3.9[192848]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/firewall.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432738.2716165-743-166992916457101/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:00 np0005554845 python3.9[192998]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:00 np0005554845 podman[193093]: 2025-12-11 05:59:00.685258275 +0000 UTC m=+0.074600386 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 00:59:00 np0005554845 podman[193094]: 2025-12-11 05:59:00.725860828 +0000 UTC m=+0.115315802 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 00:59:00 np0005554845 python3.9[193142]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432739.6297016-743-32539270979956/.source.json follow=False _original_basename=node_exporter.json.j2 checksum=6e4982940d2bfae88404914dfaf72552f6356d81 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:01 np0005554845 python3.9[193311]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:02 np0005554845 python3.9[193432]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/node_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432741.0390093-743-218733152068359/.source.yaml follow=False _original_basename=node_exporter.yaml.j2 checksum=81d906d3e1e8c4f8367276f5d3a67b80ca7e989e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:02 np0005554845 podman[193556]: 2025-12-11 05:59:02.968975009 +0000 UTC m=+0.072651524 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 00:59:03 np0005554845 python3.9[193597]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:03 np0005554845 python3.9[193724]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432742.579329-743-222358313183045/.source.json follow=False _original_basename=openstack_network_exporter.json.j2 checksum=d474f1e4c3dbd24762592c51cbe5311f0a037273 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:04 np0005554845 python3.9[193874]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:05 np0005554845 python3.9[193995]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432743.891867-743-14138915011556/.source.yaml follow=False _original_basename=openstack_network_exporter.yaml.j2 checksum=2b6bd0891e609bf38a73282f42888052b750bed6 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:05 np0005554845 python3.9[194145]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.json follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:06 np0005554845 python3.9[194266]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.json mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432745.2461016-743-76563287475123/.source.json follow=False _original_basename=podman_exporter.json.j2 checksum=e342121a88f67e2bae7ebc05d1e6d350470198a5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:07 np0005554845 python3.9[194416]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:07 np0005554845 python3.9[194537]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml mode=420 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432746.6466455-743-41492518742182/.source.yaml follow=False _original_basename=podman_exporter.yaml.j2 checksum=7ccb5eca2ff1dc337c3f3ecbbff5245af7149c47 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:08 np0005554845 python3.9[194687]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/node_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:09 np0005554845 python3.9[194763]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/node_exporter.yaml _original_basename=node_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/node_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:10 np0005554845 python3.9[194913]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/podman_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:10 np0005554845 python3.9[194989]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/podman_exporter.yaml _original_basename=podman_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/podman_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:11 np0005554845 python3.9[195139]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:11 np0005554845 python3.9[195215]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml _original_basename=ceilometer_prom_exporter.yaml.j2 recurse=False state=file path=/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:12 np0005554845 python3.9[195367]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.crt recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:13 np0005554845 python3.9[195519]: ansible-ansible.builtin.file Invoked with group=ceilometer mode=0644 owner=ceilometer path=/var/lib/openstack/certs/telemetry/default/tls.key recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False state=None _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:14 np0005554845 python3.9[195671]: ansible-ansible.builtin.file Invoked with group=zuul mode=0755 owner=zuul path=/var/lib/openstack/healthchecks setype=container_file_t state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:59:15 np0005554845 python3.9[195823]: ansible-ansible.builtin.systemd_service Invoked with enabled=True name=podman.socket state=started daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:59:15 np0005554845 systemd[1]: Reloading.
Dec 11 00:59:15 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:59:15 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:59:15 np0005554845 systemd[1]: Listening on Podman API Socket.
Dec 11 00:59:16 np0005554845 python3.9[196013]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.693 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.695 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.695 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.695 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.724 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.724 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.725 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.725 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.726 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.726 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.727 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.727 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.727 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.752 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.753 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.753 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.753 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.940 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.942 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6121MB free_disk=73.5316047668457GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.942 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:59:16 np0005554845 nova_compute[187128]: 2025-12-11 05:59:16.942 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:59:17 np0005554845 nova_compute[187128]: 2025-12-11 05:59:17.056 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 00:59:17 np0005554845 nova_compute[187128]: 2025-12-11 05:59:17.057 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 00:59:17 np0005554845 nova_compute[187128]: 2025-12-11 05:59:17.095 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 00:59:17 np0005554845 nova_compute[187128]: 2025-12-11 05:59:17.111 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 00:59:17 np0005554845 nova_compute[187128]: 2025-12-11 05:59:17.113 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 00:59:17 np0005554845 nova_compute[187128]: 2025-12-11 05:59:17.114 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.172s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:59:17 np0005554845 python3.9[196136]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432756.14281-1409-43617955500335/.source _original_basename=healthcheck follow=False checksum=ebb343c21fce35a02591a9351660cb7035a47d42 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:59:17 np0005554845 python3.9[196212]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/ceilometer_agent_compute/healthcheck.future follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:18 np0005554845 python3.9[196335]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/ceilometer_agent_compute/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432756.14281-1409-43617955500335/.source.future _original_basename=healthcheck.future follow=False checksum=d500a98192f4ddd70b4dfdc059e2d81aed36a294 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:59:19 np0005554845 python3.9[196487]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=ceilometer_agent_compute.json debug=False
Dec 11 00:59:20 np0005554845 python3.9[196639]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 00:59:21 np0005554845 python3[196791]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=ceilometer_agent_compute.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 00:59:22 np0005554845 podman[196829]: 2025-12-11 05:59:22.116623094 +0000 UTC m=+0.027962431 image pull 806262ad9f61127734555408f71447afe6ceede79cc666e6f523dacd5edec739 quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified
Dec 11 00:59:22 np0005554845 podman[196829]: 2025-12-11 05:59:22.263567534 +0000 UTC m=+0.174906881 container create f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 11 00:59:22 np0005554845 python3[196791]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name ceilometer_agent_compute --conmon-pidfile /run/ceilometer_agent_compute.pid --env KOLLA_CONFIG_STRATEGY=COPY_ALWAYS --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck compute --label config_id=edpm --label container_name=ceilometer_agent_compute --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']} --log-driver journald --log-level info --network host --security-opt label:type:ceilometer_polling_t --user ceilometer --volume /var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z --volume /var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z --volume /run/libvirt:/run/libvirt:shared,ro --volume /etc/hosts:/etc/hosts:ro --volume /etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro --volume /etc/localtime:/etc/localtime:ro --volume /etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro --volume /var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z --volume /var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z --volume /dev/log:/dev/log --volume /var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified kolla_start
Dec 11 00:59:23 np0005554845 python3.9[197017]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:59:24 np0005554845 python3.9[197171]: ansible-file Invoked with path=/etc/systemd/system/edpm_ceilometer_agent_compute.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:25 np0005554845 python3.9[197322]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765432764.3848212-1600-225718211408393/source dest=/etc/systemd/system/edpm_ceilometer_agent_compute.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:25 np0005554845 python3.9[197398]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:59:25 np0005554845 systemd[1]: Reloading.
Dec 11 00:59:25 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:59:25 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:59:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:59:26.210 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 00:59:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:59:26.212 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 00:59:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 05:59:26.212 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 00:59:26 np0005554845 python3.9[197510]: ansible-systemd Invoked with state=restarted name=edpm_ceilometer_agent_compute.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:59:26 np0005554845 systemd[1]: Reloading.
Dec 11 00:59:26 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:59:26 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:59:27 np0005554845 systemd[1]: Starting ceilometer_agent_compute container...
Dec 11 00:59:27 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:59:27 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d01ef11aedf19d4ddcd2d4b6a2b22d0469d31d81887ccf58acc663da74f456dc/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:27 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d01ef11aedf19d4ddcd2d4b6a2b22d0469d31d81887ccf58acc663da74f456dc/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:27 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d01ef11aedf19d4ddcd2d4b6a2b22d0469d31d81887ccf58acc663da74f456dc/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:27 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d01ef11aedf19d4ddcd2d4b6a2b22d0469d31d81887ccf58acc663da74f456dc/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:27 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d.
Dec 11 00:59:27 np0005554845 podman[197550]: 2025-12-11 05:59:27.241098417 +0000 UTC m=+0.202197471 container init f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: + sudo -E kolla_set_configs
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: sudo: unable to send audit message: Operation not permitted
Dec 11 00:59:27 np0005554845 podman[197550]: 2025-12-11 05:59:27.27470046 +0000 UTC m=+0.235799444 container start f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS)
Dec 11 00:59:27 np0005554845 podman[197550]: ceilometer_agent_compute
Dec 11 00:59:27 np0005554845 systemd[1]: Started ceilometer_agent_compute container.
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Validating config file
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Copying service configuration files
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: INFO:__main__:Writing out command to execute
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: ++ cat /run_command
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: + ARGS=
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: + sudo kolla_copy_cacerts
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: sudo: unable to send audit message: Operation not permitted
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: + [[ ! -n '' ]]
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: + . kolla_extend_start
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: + umask 0022
Dec 11 00:59:27 np0005554845 ceilometer_agent_compute[197565]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 11 00:59:27 np0005554845 podman[197572]: 2025-12-11 05:59:27.415147254 +0000 UTC m=+0.132229662 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202)
Dec 11 00:59:27 np0005554845 systemd[1]: f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-469e9fcf4d3733c9.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 00:59:27 np0005554845 systemd[1]: f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-469e9fcf4d3733c9.service: Failed with result 'exit-code'.
Dec 11 00:59:28 np0005554845 python3.9[197746]: ansible-ansible.builtin.systemd Invoked with name=edpm_ceilometer_agent_compute.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.254 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.255 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.255 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.255 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.255 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.255 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.255 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.256 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.256 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.256 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.256 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.256 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.256 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.257 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.257 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.257 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.257 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.257 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.257 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.257 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.258 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.258 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.258 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.258 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.258 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.259 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.259 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.259 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.259 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.259 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.259 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.259 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.260 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.260 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.260 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.260 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.260 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.261 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.261 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.261 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.261 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.261 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.261 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.261 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.261 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.262 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.262 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.262 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.262 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.262 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.262 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.262 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.263 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.263 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.263 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.263 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.263 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.263 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.263 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.264 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.264 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.264 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.264 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.264 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.264 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.264 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.265 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.265 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.265 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.265 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.265 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.265 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.265 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.266 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.266 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.266 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.266 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.266 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.266 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.266 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.267 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.267 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.267 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.267 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.267 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.267 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.267 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.267 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.268 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.268 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.268 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.268 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.268 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.268 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.268 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.269 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.269 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.269 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.269 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.269 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.269 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.269 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.270 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.270 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.270 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.270 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.270 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.270 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.270 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.270 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.271 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.271 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.271 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.271 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.271 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.271 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.272 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.272 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.272 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.272 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.272 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.272 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.273 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.273 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.273 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.273 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.273 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.273 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.273 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.274 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.274 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.274 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.274 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.274 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.274 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.274 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.274 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.275 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.275 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.275 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.275 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.275 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.275 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.276 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.276 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.276 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.276 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.276 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.276 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.277 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.277 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.277 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.277 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.277 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.277 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.278 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.278 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.278 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.278 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.278 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.278 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.279 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.279 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.279 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.279 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 11 00:59:28 np0005554845 systemd[1]: Stopping ceilometer_agent_compute container...
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.297 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.300 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.301 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.333 2 INFO cotyledon._service_manager [-] Caught SIGTERM signal, graceful exiting of master process
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.405 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.434 2 DEBUG cotyledon._service_manager [-] Killing services with signal SIGTERM _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:304
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.435 2 DEBUG cotyledon._service_manager [-] Waiting services to terminate _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:308
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.476 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.476 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.476 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.476 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.476 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.476 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.476 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.477 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.477 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.477 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.477 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.477 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.477 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.477 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.477 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.477 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.477 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.478 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.479 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.480 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.480 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.480 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.480 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.480 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.480 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.480 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.480 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.480 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.480 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.480 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.481 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.482 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.482 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.482 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.482 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.482 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.482 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.482 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.482 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.482 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.482 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.482 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.483 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.484 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.484 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.484 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.484 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.484 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.484 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.484 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.484 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.484 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.484 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.484 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.485 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.485 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.485 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.485 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.485 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.485 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.485 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.485 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.485 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.485 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.485 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.486 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.486 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.486 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.486 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.486 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.486 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.486 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.486 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.486 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.486 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.486 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.487 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.487 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.487 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.487 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.487 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.487 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.487 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.487 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.487 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.487 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.488 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.489 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.490 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.491 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.491 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.491 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.491 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.491 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.491 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.491 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.491 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.491 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.491 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.491 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.492 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.492 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.492 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.492 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.492 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.492 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.492 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.492 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.492 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.493 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.493 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.493 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.493 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.493 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.493 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.493 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.493 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.493 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.493 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.494 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.494 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.494 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.494 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.494 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.494 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.494 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.494 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.494 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.494 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.494 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.495 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.495 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.495 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.495 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.495 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.495 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.495 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.495 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.495 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.495 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.495 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.496 12 INFO cotyledon._service [-] Caught SIGTERM signal, graceful exiting of service AgentManager(0) [12]
Dec 11 00:59:28 np0005554845 virtqemud[186638]: End of file while reading data: Input/output error
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197565]: 2025-12-11 05:59:28.510 2 DEBUG cotyledon._service_manager [-] Shutdown finish _shutdown /usr/lib/python3.9/site-packages/cotyledon/_service_manager.py:320
Dec 11 00:59:28 np0005554845 systemd[1]: libpod-f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d.scope: Deactivated successfully.
Dec 11 00:59:28 np0005554845 systemd[1]: libpod-f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d.scope: Consumed 1.386s CPU time.
Dec 11 00:59:28 np0005554845 podman[197750]: 2025-12-11 05:59:28.651978759 +0000 UTC m=+0.351440084 container died f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251202)
Dec 11 00:59:28 np0005554845 systemd[1]: f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-469e9fcf4d3733c9.timer: Deactivated successfully.
Dec 11 00:59:28 np0005554845 systemd[1]: Stopped /usr/bin/podman healthcheck run f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d.
Dec 11 00:59:28 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-userdata-shm.mount: Deactivated successfully.
Dec 11 00:59:28 np0005554845 systemd[1]: var-lib-containers-storage-overlay-d01ef11aedf19d4ddcd2d4b6a2b22d0469d31d81887ccf58acc663da74f456dc-merged.mount: Deactivated successfully.
Dec 11 00:59:28 np0005554845 podman[197750]: 2025-12-11 05:59:28.723256855 +0000 UTC m=+0.422718080 container cleanup f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 00:59:28 np0005554845 podman[197750]: ceilometer_agent_compute
Dec 11 00:59:28 np0005554845 podman[197784]: ceilometer_agent_compute
Dec 11 00:59:28 np0005554845 systemd[1]: edpm_ceilometer_agent_compute.service: Deactivated successfully.
Dec 11 00:59:28 np0005554845 systemd[1]: Stopped ceilometer_agent_compute container.
Dec 11 00:59:28 np0005554845 systemd[1]: Starting ceilometer_agent_compute container...
Dec 11 00:59:28 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:59:28 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d01ef11aedf19d4ddcd2d4b6a2b22d0469d31d81887ccf58acc663da74f456dc/merged/etc/ceilometer/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:28 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d01ef11aedf19d4ddcd2d4b6a2b22d0469d31d81887ccf58acc663da74f456dc/merged/etc/ceilometer/ceilometer_prom_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:28 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d01ef11aedf19d4ddcd2d4b6a2b22d0469d31d81887ccf58acc663da74f456dc/merged/var/lib/openstack/config supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:28 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d01ef11aedf19d4ddcd2d4b6a2b22d0469d31d81887ccf58acc663da74f456dc/merged/var/lib/kolla/config_files/config.json supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:28 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d.
Dec 11 00:59:28 np0005554845 podman[197797]: 2025-12-11 05:59:28.979939565 +0000 UTC m=+0.145670847 container init f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 00:59:28 np0005554845 ceilometer_agent_compute[197813]: + sudo -E kolla_set_configs
Dec 11 00:59:29 np0005554845 podman[197797]: 2025-12-11 05:59:29.014436902 +0000 UTC m=+0.180168184 container start f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 00:59:29 np0005554845 podman[197797]: ceilometer_agent_compute
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: sudo: unable to send audit message: Operation not permitted
Dec 11 00:59:29 np0005554845 systemd[1]: Started ceilometer_agent_compute container.
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Loading config file at /var/lib/kolla/config_files/config.json
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Validating config file
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Kolla config strategy set to: COPY_ALWAYS
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Copying service configuration files
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer.conf to /etc/ceilometer/ceilometer.conf
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Deleting /etc/ceilometer/polling.yaml
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Copying /var/lib/openstack/config/polling.yaml to /etc/ceilometer/polling.yaml
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Setting permission for /etc/ceilometer/polling.yaml
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Copying /var/lib/openstack/config/custom.conf to /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/01-ceilometer-custom.conf
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Deleting /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Copying /var/lib/openstack/config/ceilometer-host-specific.conf to /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Setting permission for /etc/ceilometer/ceilometer.conf.d/02-ceilometer-host-specific.conf
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: INFO:__main__:Writing out command to execute
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: ++ cat /run_command
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: + CMD='/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: + ARGS=
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: + sudo kolla_copy_cacerts
Dec 11 00:59:29 np0005554845 podman[197820]: 2025-12-11 05:59:29.101354012 +0000 UTC m=+0.065256733 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=1, health_log=, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 00:59:29 np0005554845 systemd[1]: f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-478f6f1371c7a10a.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 00:59:29 np0005554845 systemd[1]: f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-478f6f1371c7a10a.service: Failed with result 'exit-code'.
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: sudo: unable to send audit message: Operation not permitted
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: + [[ ! -n '' ]]
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: + . kolla_extend_start
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: Running command: '/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: + echo 'Running command: '\''/usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout'\'''
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: + umask 0022
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: + exec /usr/bin/ceilometer-polling --polling-namespaces compute --logfile /dev/stdout
Dec 11 00:59:29 np0005554845 python3.9[197997]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/node_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.891 2 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_manager_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:40
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.891 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.891 2 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.891 2 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.892 2 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.892 2 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.892 2 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.892 2 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.892 2 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.892 2 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.892 2 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.893 2 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.893 2 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.893 2 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.893 2 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.893 2 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.893 2 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.894 2 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.894 2 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.894 2 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.894 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.894 2 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.894 2 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.894 2 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.894 2 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.895 2 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.895 2 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.895 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.895 2 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.895 2 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.895 2 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.895 2 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.895 2 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.895 2 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.895 2 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.896 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.896 2 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.896 2 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.896 2 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.896 2 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.896 2 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.896 2 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.896 2 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.896 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.896 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.896 2 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.897 2 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.897 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.897 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.897 2 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.897 2 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.897 2 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.897 2 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.897 2 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.897 2 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.897 2 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.898 2 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.898 2 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.898 2 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.898 2 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.898 2 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.898 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.898 2 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.898 2 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.898 2 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.898 2 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.899 2 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.899 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.899 2 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.899 2 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.899 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.899 2 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.899 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.899 2 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.899 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.899 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.900 2 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.901 2 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.902 2 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.902 2 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.902 2 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.903 2 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.903 2 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.903 2 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.903 2 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.903 2 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.903 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.903 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.903 2 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.903 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.903 2 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.904 2 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.904 2 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.904 2 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.904 2 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.904 2 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.904 2 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.904 2 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.904 2 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.904 2 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.904 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.904 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.905 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.905 2 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.905 2 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.905 2 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.905 2 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.905 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.905 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.905 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.905 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.905 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.905 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.906 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.906 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.906 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.906 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.906 2 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.906 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.906 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.906 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.907 2 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.907 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.907 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.908 2 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.908 2 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.926 12 INFO ceilometer.polling.manager [-] Looking for dynamic pollsters configurations at [['/etc/ceilometer/pollsters.d']].
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.928 12 INFO ceilometer.polling.manager [-] No dynamic pollsters found in folder [/etc/ceilometer/pollsters.d].
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.928 12 INFO ceilometer.polling.manager [-] No dynamic pollsters file found in dirs [['/etc/ceilometer/pollsters.d']].
Dec 11 00:59:29 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:29.941 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.061 12 DEBUG cotyledon.oslo_config_glue [-] Full set of CONF: _load_service_options /usr/lib/python3.9/site-packages/cotyledon/oslo_config_glue.py:48
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.061 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2589
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.061 12 DEBUG cotyledon.oslo_config_glue [-] Configuration options gathered from: log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2590
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.061 12 DEBUG cotyledon.oslo_config_glue [-] command line args: ['--polling-namespaces', 'compute', '--logfile', '/dev/stdout'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2591
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.061 12 DEBUG cotyledon.oslo_config_glue [-] config files: ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2592
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.062 12 DEBUG cotyledon.oslo_config_glue [-] ================================================================================ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2594
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.062 12 DEBUG cotyledon.oslo_config_glue [-] batch_size                     = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.062 12 DEBUG cotyledon.oslo_config_glue [-] cfg_file                       = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.062 12 DEBUG cotyledon.oslo_config_glue [-] config_dir                     = ['/etc/ceilometer/ceilometer.conf.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.062 12 DEBUG cotyledon.oslo_config_glue [-] config_file                    = ['/etc/ceilometer/ceilometer.conf'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.062 12 DEBUG cotyledon.oslo_config_glue [-] config_source                  = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.062 12 DEBUG cotyledon.oslo_config_glue [-] control_exchange               = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.062 12 DEBUG cotyledon.oslo_config_glue [-] debug                          = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.062 12 DEBUG cotyledon.oslo_config_glue [-] default_log_levels             = ['amqp=WARN', 'amqplib=WARN', 'boto=WARN', 'qpid=WARN', 'sqlalchemy=WARN', 'suds=INFO', 'oslo.messaging=INFO', 'oslo_messaging=INFO', 'iso8601=WARN', 'requests.packages.urllib3.connectionpool=WARN', 'urllib3.connectionpool=WARN', 'websocket=WARN', 'requests.packages.urllib3.util.retry=WARN', 'urllib3.util.retry=WARN', 'keystonemiddleware=WARN', 'routes.middleware=WARN', 'stevedore=WARN', 'taskflow=WARN', 'keystoneauth=WARN', 'oslo.cache=INFO', 'oslo_policy=INFO', 'dogpile.core.dogpile=INFO', 'futurist=INFO', 'neutronclient=INFO', 'keystoneclient=INFO'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.063 12 DEBUG cotyledon.oslo_config_glue [-] event_pipeline_cfg_file        = event_pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.063 12 DEBUG cotyledon.oslo_config_glue [-] graceful_shutdown_timeout      = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.063 12 DEBUG cotyledon.oslo_config_glue [-] host                           = compute-2.ctlplane.example.com log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.063 12 DEBUG cotyledon.oslo_config_glue [-] http_timeout                   = 600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.063 12 DEBUG cotyledon.oslo_config_glue [-] hypervisor_inspector           = libvirt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.063 12 DEBUG cotyledon.oslo_config_glue [-] instance_format                = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.063 12 DEBUG cotyledon.oslo_config_glue [-] instance_uuid_format           = [instance: %(uuid)s]  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.063 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_type                   = kvm log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.063 12 DEBUG cotyledon.oslo_config_glue [-] libvirt_uri                    =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.063 12 DEBUG cotyledon.oslo_config_glue [-] log_config_append              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.063 12 DEBUG cotyledon.oslo_config_glue [-] log_date_format                = %Y-%m-%d %H:%M:%S log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.064 12 DEBUG cotyledon.oslo_config_glue [-] log_dir                        = /var/log/ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.064 12 DEBUG cotyledon.oslo_config_glue [-] log_file                       = /dev/stdout log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.064 12 DEBUG cotyledon.oslo_config_glue [-] log_options                    = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.064 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval            = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.064 12 DEBUG cotyledon.oslo_config_glue [-] log_rotate_interval_type       = days log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.064 12 DEBUG cotyledon.oslo_config_glue [-] log_rotation_type              = none log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.064 12 DEBUG cotyledon.oslo_config_glue [-] logging_context_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [%(global_request_id)s %(request_id)s %(user_identity)s] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.064 12 DEBUG cotyledon.oslo_config_glue [-] logging_debug_format_suffix    = %(funcName)s %(pathname)s:%(lineno)d log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.064 12 DEBUG cotyledon.oslo_config_glue [-] logging_default_format_string  = %(asctime)s.%(msecs)03d %(process)d %(levelname)s %(name)s [-] %(instance)s%(message)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.064 12 DEBUG cotyledon.oslo_config_glue [-] logging_exception_prefix       = %(asctime)s.%(msecs)03d %(process)d ERROR %(name)s %(instance)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.064 12 DEBUG cotyledon.oslo_config_glue [-] logging_user_identity_format   = %(user)s %(project)s %(domain)s %(system_scope)s %(user_domain)s %(project_domain)s log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_count              = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] max_logfile_size_mb            = 200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] max_parallel_requests          = 64 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] partitioning_group_prefix      = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] pipeline_cfg_file              = pipeline.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] polling_namespaces             = ['compute'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] pollsters_definitions_dirs     = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] publish_errors                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_burst               = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_except_level        = CRITICAL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] rate_limit_interval            = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.065 12 DEBUG cotyledon.oslo_config_glue [-] reseller_prefix                = AUTH_ log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_keys         = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_length       = 256 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] reserved_metadata_namespace    = ['metering.'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] rootwrap_config                = /etc/ceilometer/rootwrap.conf log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] sample_source                  = openstack log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] syslog_log_facility            = LOG_USER log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] tenant_name_discovery          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] transport_url                  = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] use_eventlog                   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] use_journal                    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] use_json                       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.066 12 DEBUG cotyledon.oslo_config_glue [-] use_stderr                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] use_syslog                     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] watch_log_file                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2602
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] compute.instance_discovery_method = libvirt_metadata log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_cache_expiry  = 3600 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] compute.resource_update_interval = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] coordination.backend_url       = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] event.definitions_cfg_file     = event_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] event.drop_unmatched_notifications = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] event.store_raw                = [] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.node_manager_init_retry   = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] ipmi.polling_retry             = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.067 12 DEBUG cotyledon.oslo_config_glue [-] meter.meter_definitions_dirs   = ['/etc/ceilometer/meters.d', '/usr/lib/python3.9/site-packages/ceilometer/data/meters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.068 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_on_failure     = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.068 12 DEBUG cotyledon.oslo_config_glue [-] monasca.archive_path           = mon_pub_failures.txt log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.068 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.068 12 DEBUG cotyledon.oslo_config_glue [-] monasca.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.068 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_count            = 1000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.068 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_max_retries      = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.068 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_mode             = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.068 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_polling_interval = 5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.068 12 DEBUG cotyledon.oslo_config_glue [-] monasca.batch_timeout          = 15 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.068 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.069 12 DEBUG cotyledon.oslo_config_glue [-] monasca.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.069 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_max_retries     = 3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.069 12 DEBUG cotyledon.oslo_config_glue [-] monasca.client_retry_interval  = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.069 12 DEBUG cotyledon.oslo_config_glue [-] monasca.clientapi_version      = 2_0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.069 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cloud_name             = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.069 12 DEBUG cotyledon.oslo_config_glue [-] monasca.cluster                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.069 12 DEBUG cotyledon.oslo_config_glue [-] monasca.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.069 12 DEBUG cotyledon.oslo_config_glue [-] monasca.control_plane          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.069 12 DEBUG cotyledon.oslo_config_glue [-] monasca.enable_api_pagination  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.069 12 DEBUG cotyledon.oslo_config_glue [-] monasca.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.069 12 DEBUG cotyledon.oslo_config_glue [-] monasca.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.070 12 DEBUG cotyledon.oslo_config_glue [-] monasca.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.070 12 DEBUG cotyledon.oslo_config_glue [-] monasca.monasca_mappings       = /etc/ceilometer/monasca_field_definitions.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.070 12 DEBUG cotyledon.oslo_config_glue [-] monasca.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.070 12 DEBUG cotyledon.oslo_config_glue [-] monasca.retry_on_failure       = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.070 12 DEBUG cotyledon.oslo_config_glue [-] monasca.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.070 12 DEBUG cotyledon.oslo_config_glue [-] monasca.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.070 12 DEBUG cotyledon.oslo_config_glue [-] notification.ack_on_event_error = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.070 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_size        = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.070 12 DEBUG cotyledon.oslo_config_glue [-] notification.batch_timeout     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.070 12 DEBUG cotyledon.oslo_config_glue [-] notification.messaging_urls    = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.070 12 DEBUG cotyledon.oslo_config_glue [-] notification.notification_control_exchanges = ['nova', 'glance', 'neutron', 'cinder', 'heat', 'keystone', 'sahara', 'trove', 'zaqar', 'swift', 'ceilometer', 'magnum', 'dns', 'ironic', 'aodh'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.071 12 DEBUG cotyledon.oslo_config_glue [-] notification.pipelines         = ['meter', 'event'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.071 12 DEBUG cotyledon.oslo_config_glue [-] notification.workers           = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.071 12 DEBUG cotyledon.oslo_config_glue [-] polling.batch_size             = 50 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.071 12 DEBUG cotyledon.oslo_config_glue [-] polling.cfg_file               = polling.yaml log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.071 12 DEBUG cotyledon.oslo_config_glue [-] polling.partitioning_group_prefix = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.071 12 DEBUG cotyledon.oslo_config_glue [-] polling.pollsters_definitions_dirs = ['/etc/ceilometer/pollsters.d'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.071 12 DEBUG cotyledon.oslo_config_glue [-] polling.tenant_name_discovery  = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.071 12 DEBUG cotyledon.oslo_config_glue [-] publisher.telemetry_secret     = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.071 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.event_topic = event log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.071 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.metering_topic = metering log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.072 12 DEBUG cotyledon.oslo_config_glue [-] publisher_notifier.telemetry_driver = messagingv2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.072 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.access_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.072 12 DEBUG cotyledon.oslo_config_glue [-] rgw_admin_credentials.secret_key = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.072 12 DEBUG cotyledon.oslo_config_glue [-] rgw_client.implicit_tenants    = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.072 12 DEBUG cotyledon.oslo_config_glue [-] service_types.cinder           = volumev3 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.072 12 DEBUG cotyledon.oslo_config_glue [-] service_types.glance           = image log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.072 12 DEBUG cotyledon.oslo_config_glue [-] service_types.neutron          = network log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.072 12 DEBUG cotyledon.oslo_config_glue [-] service_types.nova             = compute log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.072 12 DEBUG cotyledon.oslo_config_glue [-] service_types.radosgw          = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.072 12 DEBUG cotyledon.oslo_config_glue [-] service_types.swift            = object-store log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.073 12 DEBUG cotyledon.oslo_config_glue [-] vmware.api_retry_count         = 10 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.073 12 DEBUG cotyledon.oslo_config_glue [-] vmware.ca_file                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.073 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_ip                 = 127.0.0.1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.073 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_password           = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.073 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_port               = 443 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.073 12 DEBUG cotyledon.oslo_config_glue [-] vmware.host_username           =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.073 12 DEBUG cotyledon.oslo_config_glue [-] vmware.insecure                = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.073 12 DEBUG cotyledon.oslo_config_glue [-] vmware.task_poll_interval      = 0.5 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.073 12 DEBUG cotyledon.oslo_config_glue [-] vmware.wsdl_location           = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.073 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_section = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_type  = password log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.auth_url   = https://keystone-internal.openstack.svc:5000 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.cafile     = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.certfile   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.collect_timing = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.default_domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_id  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.domain_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.insecure   = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.interface  = internalURL log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.074 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.keyfile    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.075 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.password   = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.075 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.075 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.075 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.075 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.project_name = service log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.075 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.region_name = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.075 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.split_loggers = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.075 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.system_scope = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.075 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.timeout    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.076 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.trust_id   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.076 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_id = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.076 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_domain_name = Default log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.076 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.user_id    = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.076 12 DEBUG cotyledon.oslo_config_glue [-] service_credentials.username   = ceilometer log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.076 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_section           = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.076 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.auth_type              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.076 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.cafile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.076 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.certfile               = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.077 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.collect_timing         = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.077 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.insecure               = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.077 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.interface              = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.077 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.keyfile                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.077 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.region_name            = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.077 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.split_loggers          = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.077 12 DEBUG cotyledon.oslo_config_glue [-] gnocchi.timeout                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.077 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_section             = service_credentials log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.077 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.auth_type                = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.cafile                   = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.certfile                 = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.collect_timing           = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.insecure                 = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.interface                = internal log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.keyfile                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.region_name              = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.split_loggers            = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] zaqar.timeout                  = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.driver = ['noop'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.retry = -1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.topics = ['notifications'] log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.078 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_notifications.transport_url = **** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.079 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_auto_delete = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.079 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.amqp_durable_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.079 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_min_size = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.079 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.conn_pool_ttl = 1200 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.079 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.direct_mandatory_flag = True log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.079 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.enable_cancel_on_failover = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.079 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_in_pthread = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.079 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_rate = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.079 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.heartbeat_timeout_threshold = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.079 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_compression = None log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.079 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_failover_strategy = round-robin log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.080 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_missing_consumer_retry_timeout = 60 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.080 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.kombu_reconnect_delay = 1.0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.080 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_ha_queues = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.080 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_interval_max = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.080 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_login_method = AMQPLAIN log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.080 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_qos_prefetch_count = 100 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.080 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_delivery_limit = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.080 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_bytes = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.080 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_max_memory_length = 0 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.080 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_quorum_queue = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.081 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_backoff = 2 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.081 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_retry_interval = 1 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.081 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rabbit_transient_queues_ttl = 1800 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.081 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.rpc_conn_pool_size = 30 log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.081 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl      = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.081 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_ca_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.081 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_cert_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.081 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_enforce_fips_mode = False log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.082 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_key_file =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.082 12 DEBUG cotyledon.oslo_config_glue [-] oslo_messaging_rabbit.ssl_version =  log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2609
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.082 12 DEBUG cotyledon.oslo_config_glue [-] ******************************************************************************** log_opt_values /usr/lib/python3.9/site-packages/oslo_config/cfg.py:2613
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.082 12 DEBUG cotyledon._service [-] Run service AgentManager(0) [12] wait_forever /usr/lib/python3.9/site-packages/cotyledon/_service.py:241
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.085 12 DEBUG ceilometer.agent [-] Config file: {'sources': [{'name': 'pollsters', 'interval': 120, 'meters': ['power.state', 'cpu', 'memory.usage', 'disk.*', 'network.*']}]} load_config /usr/lib/python3.9/site-packages/ceilometer/agent.py:64
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.093 12 DEBUG ceilometer.compute.virt.libvirt.utils [-] Connecting to libvirt: qemu:///system new_libvirt_connection /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/utils.py:93
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 05:59:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 00:59:30 np0005554845 python3.9[198126]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/node_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432769.3307774-1696-153643809134154/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:59:31 np0005554845 podman[198174]: 2025-12-11 05:59:31.145531621 +0000 UTC m=+0.071588815 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 00:59:31 np0005554845 podman[198175]: 2025-12-11 05:59:31.193275467 +0000 UTC m=+0.112451624 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 00:59:31 np0005554845 python3.9[198324]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=node_exporter.json debug=False
Dec 11 00:59:32 np0005554845 python3.9[198476]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 00:59:33 np0005554845 podman[198501]: 2025-12-11 05:59:33.161351499 +0000 UTC m=+0.082136601 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 00:59:33 np0005554845 python3[198648]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=node_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 00:59:33 np0005554845 podman[198686]: 2025-12-11 05:59:33.969584926 +0000 UTC m=+0.065723105 container create 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, container_name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, config_id=edpm)
Dec 11 00:59:33 np0005554845 podman[198686]: 2025-12-11 05:59:33.941585866 +0000 UTC m=+0.037724035 image pull 0da6a335fe1356545476b749c68f022c897de3a2139e8f0054f6937349ee2b83 quay.io/prometheus/node-exporter:v1.5.0
Dec 11 00:59:33 np0005554845 python3[198648]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name node_exporter --conmon-pidfile /run/node_exporter.pid --env OS_ENDPOINT_TYPE=internal --healthcheck-command /openstack/healthcheck node_exporter --label config_id=edpm --label container_name=node_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9100:9100 --user root --volume /var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z --volume /var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw --volume /var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z quay.io/prometheus/node-exporter:v1.5.0 --web.config.file=/etc/node_exporter/node_exporter.yaml --web.disable-exporter-metrics --collector.systemd --collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service --no-collector.dmi --no-collector.entropy --no-collector.thermal_zone --no-collector.time --no-collector.timex --no-collector.uname --no-collector.stat --no-collector.hwmon --no-collector.os --no-collector.selinux --no-collector.textfile --no-collector.powersupplyclass --no-collector.pressure --no-collector.rapl
Dec 11 00:59:35 np0005554845 python3.9[198876]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:59:36 np0005554845 python3.9[199030]: ansible-file Invoked with path=/etc/systemd/system/edpm_node_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:36 np0005554845 python3.9[199181]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765432776.1310163-1855-231430683456986/source dest=/etc/systemd/system/edpm_node_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:37 np0005554845 python3.9[199257]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:59:37 np0005554845 systemd[1]: Reloading.
Dec 11 00:59:37 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:59:37 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:59:38 np0005554845 python3.9[199367]: ansible-systemd Invoked with state=restarted name=edpm_node_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:59:38 np0005554845 systemd[1]: Reloading.
Dec 11 00:59:38 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:59:38 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:59:38 np0005554845 systemd[1]: Starting node_exporter container...
Dec 11 00:59:38 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:59:38 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a4a28ebdcb4b1ba65bdcfb9c808edc8f4af85fea0fb1c69201aa291e0834369/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:38 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a4a28ebdcb4b1ba65bdcfb9c808edc8f4af85fea0fb1c69201aa291e0834369/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:38 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb.
Dec 11 00:59:38 np0005554845 podman[199407]: 2025-12-11 05:59:38.941305581 +0000 UTC m=+0.166992955 container init 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.962Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.962Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.963Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.963Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.964Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.964Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.964Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.964Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=arp
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=bcache
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=bonding
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=cpu
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=edac
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=filefd
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=netclass
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=netdev
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=netstat
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=nfs
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=nvme
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=softnet
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=systemd
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=xfs
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.965Z caller=node_exporter.go:117 level=info collector=zfs
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.966Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 11 00:59:38 np0005554845 node_exporter[199422]: ts=2025-12-11T05:59:38.967Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec 11 00:59:38 np0005554845 podman[199407]: 2025-12-11 05:59:38.982900391 +0000 UTC m=+0.208587725 container start 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 00:59:38 np0005554845 podman[199407]: node_exporter
Dec 11 00:59:38 np0005554845 systemd[1]: Started node_exporter container.
Dec 11 00:59:39 np0005554845 podman[199431]: 2025-12-11 05:59:39.086899055 +0000 UTC m=+0.085760150 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 00:59:40 np0005554845 python3.9[199604]: ansible-ansible.builtin.systemd Invoked with name=edpm_node_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:59:40 np0005554845 systemd[1]: Stopping node_exporter container...
Dec 11 00:59:40 np0005554845 systemd[1]: libpod-1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb.scope: Deactivated successfully.
Dec 11 00:59:40 np0005554845 podman[199608]: 2025-12-11 05:59:40.195832717 +0000 UTC m=+0.085150303 container died 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 11 00:59:40 np0005554845 systemd[1]: 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb-22464d3a1f258aef.timer: Deactivated successfully.
Dec 11 00:59:40 np0005554845 systemd[1]: Stopped /usr/bin/podman healthcheck run 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb.
Dec 11 00:59:40 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb-userdata-shm.mount: Deactivated successfully.
Dec 11 00:59:40 np0005554845 systemd[1]: var-lib-containers-storage-overlay-1a4a28ebdcb4b1ba65bdcfb9c808edc8f4af85fea0fb1c69201aa291e0834369-merged.mount: Deactivated successfully.
Dec 11 00:59:40 np0005554845 podman[199608]: 2025-12-11 05:59:40.248362373 +0000 UTC m=+0.137679959 container cleanup 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 11 00:59:40 np0005554845 podman[199608]: node_exporter
Dec 11 00:59:40 np0005554845 systemd[1]: edpm_node_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 11 00:59:40 np0005554845 podman[199635]: node_exporter
Dec 11 00:59:40 np0005554845 systemd[1]: edpm_node_exporter.service: Failed with result 'exit-code'.
Dec 11 00:59:40 np0005554845 systemd[1]: Stopped node_exporter container.
Dec 11 00:59:40 np0005554845 systemd[1]: Starting node_exporter container...
Dec 11 00:59:40 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:59:40 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a4a28ebdcb4b1ba65bdcfb9c808edc8f4af85fea0fb1c69201aa291e0834369/merged/etc/node_exporter/node_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:40 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/1a4a28ebdcb4b1ba65bdcfb9c808edc8f4af85fea0fb1c69201aa291e0834369/merged/etc/node_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:40 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb.
Dec 11 00:59:40 np0005554845 podman[199648]: 2025-12-11 05:59:40.522618501 +0000 UTC m=+0.155842833 container init 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.542Z caller=node_exporter.go:180 level=info msg="Starting node_exporter" version="(version=1.5.0, branch=HEAD, revision=1b48970ffcf5630534fb00bb0687d73c66d1c959)"
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.542Z caller=node_exporter.go:181 level=info msg="Build context" build_context="(go=go1.19.3, user=root@6e7732a7b81b, date=20221129-18:59:09)"
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.542Z caller=node_exporter.go:183 level=warn msg="Node Exporter is running as root user. This exporter is designed to run as unprivileged user, root is not required."
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.543Z caller=filesystem_common.go:111 level=info collector=filesystem msg="Parsed flag --collector.filesystem.mount-points-exclude" flag=^/(dev|proc|run/credentials/.+|sys|var/lib/docker/.+|var/lib/containers/storage/.+)($|/)
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.543Z caller=filesystem_common.go:113 level=info collector=filesystem msg="Parsed flag --collector.filesystem.fs-types-exclude" flag=^(autofs|binfmt_misc|bpf|cgroup2?|configfs|debugfs|devpts|devtmpfs|fusectl|hugetlbfs|iso9660|mqueue|nsfs|overlay|proc|procfs|pstore|rpc_pipefs|securityfs|selinuxfs|squashfs|sysfs|tracefs)$
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.543Z caller=diskstats_common.go:111 level=info collector=diskstats msg="Parsed flag --collector.diskstats.device-exclude" flag=^(ram|loop|fd|(h|s|v|xv)d[a-z]|nvme\d+n\d+p)\d+$
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=diskstats_linux.go:264 level=error collector=diskstats msg="Failed to open directory, disabling udev device properties" path=/run/udev/data
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=systemd_linux.go:152 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-include" flag=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\.service
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=systemd_linux.go:154 level=info collector=systemd msg="Parsed flag --collector.systemd.unit-exclude" flag=.+\.(automount|device|mount|scope|slice)
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:110 level=info msg="Enabled collectors"
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=arp
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=bcache
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=bonding
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=btrfs
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=conntrack
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=cpu
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=cpufreq
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=diskstats
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=edac
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=fibrechannel
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=filefd
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=filesystem
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=infiniband
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=ipvs
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=loadavg
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=mdadm
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=meminfo
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.544Z caller=node_exporter.go:117 level=info collector=netclass
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=netdev
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=netstat
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=nfs
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=nfsd
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=nvme
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=schedstat
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=sockstat
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=softnet
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=systemd
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=tapestats
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=udp_queues
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=vmstat
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=xfs
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.545Z caller=node_exporter.go:117 level=info collector=zfs
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.546Z caller=tls_config.go:232 level=info msg="Listening on" address=[::]:9100
Dec 11 00:59:40 np0005554845 node_exporter[199663]: ts=2025-12-11T05:59:40.546Z caller=tls_config.go:268 level=info msg="TLS is enabled." http2=true address=[::]:9100
Dec 11 00:59:40 np0005554845 podman[199648]: 2025-12-11 05:59:40.555558666 +0000 UTC m=+0.188782948 container start 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 00:59:40 np0005554845 podman[199648]: node_exporter
Dec 11 00:59:40 np0005554845 systemd[1]: Started node_exporter container.
Dec 11 00:59:40 np0005554845 podman[199672]: 2025-12-11 05:59:40.644805578 +0000 UTC m=+0.071563924 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 00:59:41 np0005554845 python3.9[199847]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/podman_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:42 np0005554845 python3.9[199970]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/podman_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432780.8187459-1951-42435196513843/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:59:43 np0005554845 python3.9[200122]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=podman_exporter.json debug=False
Dec 11 00:59:43 np0005554845 python3.9[200274]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 00:59:45 np0005554845 python3[200426]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=podman_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 00:59:46 np0005554845 podman[200440]: 2025-12-11 05:59:46.796179359 +0000 UTC m=+1.549385370 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 11 00:59:47 np0005554845 podman[200538]: 2025-12-11 05:59:47.00513883 +0000 UTC m=+0.089270521 container create 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, config_id=edpm, container_name=podman_exporter)
Dec 11 00:59:47 np0005554845 podman[200538]: 2025-12-11 05:59:46.936200802 +0000 UTC m=+0.020332503 image pull e56d40e393eb5ea8704d9af8cf0d74665df83747106713fda91530f201837815 quay.io/navidys/prometheus-podman-exporter:v1.10.1
Dec 11 00:59:47 np0005554845 python3[200426]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name podman_exporter --conmon-pidfile /run/podman_exporter.pid --env OS_ENDPOINT_TYPE=internal --env CONTAINER_HOST=unix:///run/podman/podman.sock --healthcheck-command /openstack/healthcheck podman_exporter --label config_id=edpm --label container_name=podman_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9882:9882 --user root --volume /var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z --volume /run/podman/podman.sock:/run/podman/podman.sock:rw,z --volume /var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z quay.io/navidys/prometheus-podman-exporter:v1.10.1 --web.config.file=/etc/podman_exporter/podman_exporter.yaml
Dec 11 00:59:48 np0005554845 python3.9[200728]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 00:59:48 np0005554845 python3.9[200882]: ansible-file Invoked with path=/etc/systemd/system/edpm_podman_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:49 np0005554845 python3.9[201033]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765432789.0569496-2110-159713115032616/source dest=/etc/systemd/system/edpm_podman_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 00:59:50 np0005554845 python3.9[201109]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 00:59:50 np0005554845 systemd[1]: Reloading.
Dec 11 00:59:50 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:59:50 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:59:51 np0005554845 python3.9[201220]: ansible-systemd Invoked with state=restarted name=edpm_podman_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 00:59:51 np0005554845 systemd[1]: Reloading.
Dec 11 00:59:51 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 00:59:51 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 00:59:51 np0005554845 systemd[1]: Starting podman_exporter container...
Dec 11 00:59:51 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:59:51 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4ae276bdb5b7de0695c09e7971d63efe5930c6e1a7c5c7c4cc5f79238034245/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:51 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4ae276bdb5b7de0695c09e7971d63efe5930c6e1a7c5c7c4cc5f79238034245/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:51 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a.
Dec 11 00:59:51 np0005554845 podman[201260]: 2025-12-11 05:59:51.903112586 +0000 UTC m=+0.122163347 container init 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 00:59:51 np0005554845 podman_exporter[201276]: ts=2025-12-11T05:59:51.919Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 11 00:59:51 np0005554845 podman_exporter[201276]: ts=2025-12-11T05:59:51.919Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 11 00:59:51 np0005554845 podman_exporter[201276]: ts=2025-12-11T05:59:51.920Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 11 00:59:51 np0005554845 podman_exporter[201276]: ts=2025-12-11T05:59:51.920Z caller=handler.go:105 level=info collector=container
Dec 11 00:59:51 np0005554845 podman[201260]: 2025-12-11 05:59:51.934876734 +0000 UTC m=+0.153927455 container start 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 00:59:51 np0005554845 podman[201260]: podman_exporter
Dec 11 00:59:51 np0005554845 systemd[1]: Starting Podman API Service...
Dec 11 00:59:51 np0005554845 systemd[1]: Started Podman API Service.
Dec 11 00:59:51 np0005554845 systemd[1]: Started podman_exporter container.
Dec 11 00:59:51 np0005554845 podman[201287]: time="2025-12-11T05:59:51Z" level=info msg="/usr/bin/podman filtering at log level info"
Dec 11 00:59:51 np0005554845 podman[201287]: time="2025-12-11T05:59:51Z" level=info msg="Setting parallel job count to 25"
Dec 11 00:59:51 np0005554845 podman[201287]: time="2025-12-11T05:59:51Z" level=info msg="Using sqlite as database backend"
Dec 11 00:59:51 np0005554845 podman[201287]: time="2025-12-11T05:59:51Z" level=info msg="Not using native diff for overlay, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled"
Dec 11 00:59:51 np0005554845 podman[201287]: time="2025-12-11T05:59:51Z" level=info msg="Using systemd socket activation to determine API endpoint"
Dec 11 00:59:51 np0005554845 podman[201287]: time="2025-12-11T05:59:51Z" level=info msg="API service listening on \"/run/podman/podman.sock\". URI: \"unix:///run/podman/podman.sock\""
Dec 11 00:59:51 np0005554845 podman[201287]: @ - - [11/Dec/2025:05:59:51 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 11 00:59:51 np0005554845 podman[201287]: time="2025-12-11T05:59:51Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 11 00:59:52 np0005554845 podman[201287]: @ - - [11/Dec/2025:05:59:51 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19569 "" "Go-http-client/1.1"
Dec 11 00:59:52 np0005554845 podman_exporter[201276]: ts=2025-12-11T05:59:52.019Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 11 00:59:52 np0005554845 podman_exporter[201276]: ts=2025-12-11T05:59:52.020Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 11 00:59:52 np0005554845 podman_exporter[201276]: ts=2025-12-11T05:59:52.021Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 11 00:59:52 np0005554845 podman[201286]: 2025-12-11 05:59:52.02926556 +0000 UTC m=+0.080326872 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 00:59:53 np0005554845 python3.9[201473]: ansible-ansible.builtin.systemd Invoked with name=edpm_podman_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 00:59:53 np0005554845 systemd[1]: Stopping podman_exporter container...
Dec 11 00:59:53 np0005554845 podman[201287]: @ - - [11/Dec/2025:05:59:51 +0000] "GET /v4.9.3/libpod/events?filters=%7B%7D&since=&stream=true&until= HTTP/1.1" 200 1448 "" "Go-http-client/1.1"
Dec 11 00:59:53 np0005554845 systemd[1]: libpod-4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a.scope: Deactivated successfully.
Dec 11 00:59:53 np0005554845 podman[201477]: 2025-12-11 05:59:53.256013327 +0000 UTC m=+0.052811729 container died 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 00:59:53 np0005554845 systemd[1]: 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a-129832b5a4636c67.timer: Deactivated successfully.
Dec 11 00:59:53 np0005554845 systemd[1]: Stopped /usr/bin/podman healthcheck run 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a.
Dec 11 00:59:53 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a-userdata-shm.mount: Deactivated successfully.
Dec 11 00:59:53 np0005554845 systemd[1]: var-lib-containers-storage-overlay-d4ae276bdb5b7de0695c09e7971d63efe5930c6e1a7c5c7c4cc5f79238034245-merged.mount: Deactivated successfully.
Dec 11 00:59:53 np0005554845 podman[201477]: 2025-12-11 05:59:53.593665239 +0000 UTC m=+0.390463661 container cleanup 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 00:59:53 np0005554845 podman[201477]: podman_exporter
Dec 11 00:59:53 np0005554845 systemd[1]: edpm_podman_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 11 00:59:53 np0005554845 podman[201506]: podman_exporter
Dec 11 00:59:53 np0005554845 systemd[1]: edpm_podman_exporter.service: Failed with result 'exit-code'.
Dec 11 00:59:53 np0005554845 systemd[1]: Stopped podman_exporter container.
Dec 11 00:59:53 np0005554845 systemd[1]: Starting podman_exporter container...
Dec 11 00:59:53 np0005554845 systemd[1]: Started libcrun container.
Dec 11 00:59:53 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4ae276bdb5b7de0695c09e7971d63efe5930c6e1a7c5c7c4cc5f79238034245/merged/etc/podman_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:53 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d4ae276bdb5b7de0695c09e7971d63efe5930c6e1a7c5c7c4cc5f79238034245/merged/etc/podman_exporter/podman_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 00:59:53 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a.
Dec 11 00:59:53 np0005554845 podman[201519]: 2025-12-11 05:59:53.899526833 +0000 UTC m=+0.170742733 container init 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 00:59:53 np0005554845 podman_exporter[201534]: ts=2025-12-11T05:59:53.922Z caller=exporter.go:68 level=info msg="Starting podman-prometheus-exporter" version="(version=1.10.1, branch=HEAD, revision=1)"
Dec 11 00:59:53 np0005554845 podman_exporter[201534]: ts=2025-12-11T05:59:53.922Z caller=exporter.go:69 level=info msg=metrics enhanced=false
Dec 11 00:59:53 np0005554845 podman_exporter[201534]: ts=2025-12-11T05:59:53.922Z caller=handler.go:94 level=info msg="enabled collectors"
Dec 11 00:59:53 np0005554845 podman_exporter[201534]: ts=2025-12-11T05:59:53.922Z caller=handler.go:105 level=info collector=container
Dec 11 00:59:53 np0005554845 podman[201287]: @ - - [11/Dec/2025:05:59:53 +0000] "GET /v4.9.3/libpod/_ping HTTP/1.1" 200 2 "" "Go-http-client/1.1"
Dec 11 00:59:53 np0005554845 podman[201287]: time="2025-12-11T05:59:53Z" level=info msg="List containers: received `last` parameter - overwriting `limit`"
Dec 11 00:59:53 np0005554845 podman[201519]: 2025-12-11 05:59:53.935465162 +0000 UTC m=+0.206680972 container start 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 00:59:53 np0005554845 podman[201519]: podman_exporter
Dec 11 00:59:53 np0005554845 systemd[1]: Started podman_exporter container.
Dec 11 00:59:53 np0005554845 podman[201287]: @ - - [11/Dec/2025:05:59:53 +0000] "GET /v4.9.3/libpod/containers/json?all=true&external=false&last=0&namespace=false&size=true&sync=false HTTP/1.1" 200 19571 "" "Go-http-client/1.1"
Dec 11 00:59:53 np0005554845 podman_exporter[201534]: ts=2025-12-11T05:59:53.951Z caller=exporter.go:96 level=info msg="Listening on" address=:9882
Dec 11 00:59:53 np0005554845 podman_exporter[201534]: ts=2025-12-11T05:59:53.951Z caller=tls_config.go:313 level=info msg="Listening on" address=[::]:9882
Dec 11 00:59:53 np0005554845 podman_exporter[201534]: ts=2025-12-11T05:59:53.952Z caller=tls_config.go:349 level=info msg="TLS is enabled." http2=true address=[::]:9882
Dec 11 00:59:54 np0005554845 podman[201544]: 2025-12-11 05:59:54.009688281 +0000 UTC m=+0.061695756 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 00:59:55 np0005554845 python3.9[201718]: ansible-ansible.legacy.stat Invoked with path=/var/lib/openstack/healthchecks/openstack_network_exporter/healthcheck follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 00:59:55 np0005554845 python3.9[201841]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/openstack/healthchecks/openstack_network_exporter/ group=zuul mode=0700 owner=zuul setype=container_file_t src=/home/zuul/.ansible/tmp/ansible-tmp-1765432794.5350528-2206-13917406626021/.source _original_basename=healthcheck follow=False checksum=e380c11c36804bfc65a818f2960cfa663daacfe5 backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None attributes=None
Dec 11 00:59:56 np0005554845 python3.9[201993]: ansible-container_config_data Invoked with config_overrides={} config_path=/var/lib/openstack/config/telemetry config_pattern=openstack_network_exporter.json debug=False
Dec 11 00:59:56 np0005554845 auditd[701]: Audit daemon rotating log files
Dec 11 00:59:57 np0005554845 python3.9[202145]: ansible-container_config_hash Invoked with check_mode=False config_vol_prefix=/var/lib/config-data
Dec 11 00:59:58 np0005554845 python3[202297]: ansible-edpm_container_manage Invoked with concurrency=1 config_dir=/var/lib/openstack/config/telemetry config_id=edpm config_overrides={} config_patterns=openstack_network_exporter.json log_base_path=/var/log/containers/stdouts debug=False
Dec 11 01:00:00 np0005554845 podman[202338]: 2025-12-11 06:00:00.150782091 +0000 UTC m=+0.076727657 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=starting, health_failing_streak=2, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:00:00 np0005554845 systemd[1]: f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-478f6f1371c7a10a.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 01:00:00 np0005554845 systemd[1]: f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-478f6f1371c7a10a.service: Failed with result 'exit-code'.
Dec 11 01:00:01 np0005554845 podman[202405]: 2025-12-11 06:00:01.405422041 +0000 UTC m=+0.177016210 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 11 01:00:01 np0005554845 podman[202406]: 2025-12-11 06:00:01.442450898 +0000 UTC m=+0.213687548 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_controller, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202)
Dec 11 01:00:01 np0005554845 podman[202312]: 2025-12-11 06:00:01.447281157 +0000 UTC m=+2.677268190 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 11 01:00:01 np0005554845 podman[202471]: 2025-12-11 06:00:01.670697433 +0000 UTC m=+0.105566715 container create cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, version=9.6, io.openshift.tags=minimal rhel9, release=1755695350, name=ubi9-minimal, config_id=edpm, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., io.openshift.expose-services=, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 11 01:00:01 np0005554845 podman[202471]: 2025-12-11 06:00:01.608908256 +0000 UTC m=+0.043777538 image pull 186c5e97c6f6912533851a0044ea6da23938910e7bddfb4a6c0be9b48ab2a1d1 quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 11 01:00:01 np0005554845 python3[202297]: ansible-edpm_container_manage PODMAN-CONTAINER-DEBUG: podman create --name openstack_network_exporter --conmon-pidfile /run/openstack_network_exporter.pid --env OS_ENDPOINT_TYPE=internal --env OPENSTACK_NETWORK_EXPORTER_YAML=/etc/openstack_network_exporter/openstack_network_exporter.yaml --healthcheck-command /openstack/healthcheck openstack-netwo --label config_id=edpm --label container_name=openstack_network_exporter --label managed_by=edpm_ansible --label config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']} --log-driver journald --log-level info --network host --privileged=True --publish 9105:9105 --volume /var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z --volume /var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z --volume /var/run/openvswitch:/run/openvswitch:rw,z --volume /var/lib/openvswitch/ovn:/run/ovn:rw,z --volume /proc:/host/proc:ro --volume /var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z 
quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified
Dec 11 01:00:04 np0005554845 podman[202634]: 2025-12-11 06:00:04.135750975 +0000 UTC m=+0.067275644 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:00:04 np0005554845 python3.9[202683]: ansible-ansible.builtin.stat Invoked with path=/etc/sysconfig/podman_drop_in follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 01:00:05 np0005554845 python3.9[202838]: ansible-file Invoked with path=/etc/systemd/system/edpm_openstack_network_exporter.requires state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:00:05 np0005554845 python3.9[202989]: ansible-copy Invoked with src=/home/zuul/.ansible/tmp/ansible-tmp-1765432805.3046417-2365-187204536201783/source dest=/etc/systemd/system/edpm_openstack_network_exporter.service mode=0644 owner=root group=root backup=False force=True remote_src=False follow=False unsafe_writes=False _original_basename=None content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None checksum=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:00:06 np0005554845 python3.9[203065]: ansible-systemd Invoked with daemon_reload=True daemon_reexec=False scope=system no_block=False name=None state=None enabled=None force=None masked=None
Dec 11 01:00:06 np0005554845 systemd[1]: Reloading.
Dec 11 01:00:06 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 01:00:06 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 01:00:07 np0005554845 python3.9[203176]: ansible-systemd Invoked with state=restarted name=edpm_openstack_network_exporter.service enabled=True daemon_reload=False daemon_reexec=False scope=system no_block=False force=None masked=None
Dec 11 01:00:07 np0005554845 systemd[1]: Reloading.
Dec 11 01:00:07 np0005554845 systemd-rc-local-generator: /etc/rc.d/rc.local is not marked executable, skipping.
Dec 11 01:00:07 np0005554845 systemd-sysv-generator: SysV service '/etc/rc.d/init.d/network' lacks a native systemd unit file. Automatically generating a unit file for compatibility. Please update package to include a native systemd unit file, in order to make it more safe and robust.
Dec 11 01:00:08 np0005554845 systemd[1]: Starting openstack_network_exporter container...
Dec 11 01:00:08 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:00:08 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce4f6c5a2fba9a093fe5ed4fc479dc6cec3f7a9b0ab228d1fc1fecd9e3e81c76/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 11 01:00:08 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce4f6c5a2fba9a093fe5ed4fc479dc6cec3f7a9b0ab228d1fc1fecd9e3e81c76/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 01:00:08 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce4f6c5a2fba9a093fe5ed4fc479dc6cec3f7a9b0ab228d1fc1fecd9e3e81c76/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 01:00:08 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5.
Dec 11 01:00:08 np0005554845 podman[203216]: 2025-12-11 06:00:08.183209506 +0000 UTC m=+0.151835600 container init cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, config_id=edpm, build-date=2025-08-20T13:12:41, vcs-type=git, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, name=ubi9-minimal, io.buildah.version=1.33.7, architecture=x86_64, io.openshift.tags=minimal rhel9, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: INFO    06:00:08 main.go:48: registering *bridge.Collector
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: INFO    06:00:08 main.go:48: registering *coverage.Collector
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: INFO    06:00:08 main.go:48: registering *datapath.Collector
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: INFO    06:00:08 main.go:48: registering *iface.Collector
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: INFO    06:00:08 main.go:48: registering *memory.Collector
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: INFO    06:00:08 main.go:48: registering *ovnnorthd.Collector
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: INFO    06:00:08 main.go:48: registering *ovn.Collector
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: INFO    06:00:08 main.go:48: registering *ovsdbserver.Collector
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: INFO    06:00:08 main.go:48: registering *pmd_perf.Collector
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: INFO    06:00:08 main.go:48: registering *pmd_rxq.Collector
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: INFO    06:00:08 main.go:48: registering *vswitch.Collector
Dec 11 01:00:08 np0005554845 openstack_network_exporter[203232]: NOTICE  06:00:08 main.go:76: listening on https://:9105/metrics
Dec 11 01:00:08 np0005554845 podman[203216]: 2025-12-11 06:00:08.213298507 +0000 UTC m=+0.181924591 container start cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, version=9.6, config_id=edpm, name=ubi9-minimal, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 11 01:00:08 np0005554845 podman[203216]: openstack_network_exporter
Dec 11 01:00:08 np0005554845 systemd[1]: Started openstack_network_exporter container.
Dec 11 01:00:08 np0005554845 podman[203242]: 2025-12-11 06:00:08.291972125 +0000 UTC m=+0.068103066 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_id=edpm, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public)
Dec 11 01:00:09 np0005554845 python3.9[203416]: ansible-ansible.builtin.systemd Invoked with name=edpm_openstack_network_exporter.service state=restarted daemon_reload=False daemon_reexec=False scope=system no_block=False enabled=None force=None masked=None
Dec 11 01:00:09 np0005554845 systemd[1]: Stopping openstack_network_exporter container...
Dec 11 01:00:09 np0005554845 systemd[1]: libpod-cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5.scope: Deactivated successfully.
Dec 11 01:00:09 np0005554845 podman[203420]: 2025-12-11 06:00:09.255988947 +0000 UTC m=+0.073837710 container died cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible, name=ubi9-minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, vcs-type=git, architecture=x86_64, io.openshift.tags=minimal rhel9)
Dec 11 01:00:09 np0005554845 systemd[1]: cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5-1b0e1c6de8f73551.timer: Deactivated successfully.
Dec 11 01:00:09 np0005554845 systemd[1]: Stopped /usr/bin/podman healthcheck run cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5.
Dec 11 01:00:09 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5-userdata-shm.mount: Deactivated successfully.
Dec 11 01:00:09 np0005554845 systemd[1]: var-lib-containers-storage-overlay-ce4f6c5a2fba9a093fe5ed4fc479dc6cec3f7a9b0ab228d1fc1fecd9e3e81c76-merged.mount: Deactivated successfully.
Dec 11 01:00:10 np0005554845 podman[203420]: 2025-12-11 06:00:10.557228818 +0000 UTC m=+1.375077611 container cleanup cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., distribution-scope=public, io.openshift.expose-services=, config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, architecture=x86_64)
Dec 11 01:00:10 np0005554845 podman[203420]: openstack_network_exporter
Dec 11 01:00:10 np0005554845 systemd[1]: edpm_openstack_network_exporter.service: Main process exited, code=exited, status=2/INVALIDARGUMENT
Dec 11 01:00:10 np0005554845 podman[203448]: openstack_network_exporter
Dec 11 01:00:10 np0005554845 systemd[1]: edpm_openstack_network_exporter.service: Failed with result 'exit-code'.
Dec 11 01:00:10 np0005554845 systemd[1]: Stopped openstack_network_exporter container.
Dec 11 01:00:10 np0005554845 systemd[1]: Starting openstack_network_exporter container...
Dec 11 01:00:10 np0005554845 podman[203461]: 2025-12-11 06:00:10.795748829 +0000 UTC m=+0.088153382 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:00:10 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:00:10 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce4f6c5a2fba9a093fe5ed4fc479dc6cec3f7a9b0ab228d1fc1fecd9e3e81c76/merged/run/ovn supports timestamps until 2038 (0x7fffffff)
Dec 11 01:00:10 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce4f6c5a2fba9a093fe5ed4fc479dc6cec3f7a9b0ab228d1fc1fecd9e3e81c76/merged/etc/openstack_network_exporter/openstack_network_exporter.yaml supports timestamps until 2038 (0x7fffffff)
Dec 11 01:00:10 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ce4f6c5a2fba9a093fe5ed4fc479dc6cec3f7a9b0ab228d1fc1fecd9e3e81c76/merged/etc/openstack_network_exporter/tls supports timestamps until 2038 (0x7fffffff)
Dec 11 01:00:10 np0005554845 systemd[1]: Started /usr/bin/podman healthcheck run cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5.
Dec 11 01:00:10 np0005554845 podman[203462]: 2025-12-11 06:00:10.86409729 +0000 UTC m=+0.150690128 container init cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, distribution-scope=public, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, name=ubi9-minimal, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, vcs-type=git, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: INFO    06:00:10 main.go:48: registering *bridge.Collector
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: INFO    06:00:10 main.go:48: registering *coverage.Collector
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: INFO    06:00:10 main.go:48: registering *datapath.Collector
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: INFO    06:00:10 main.go:48: registering *iface.Collector
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: INFO    06:00:10 main.go:48: registering *memory.Collector
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: INFO    06:00:10 main.go:48: registering *ovnnorthd.Collector
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: INFO    06:00:10 main.go:48: registering *ovn.Collector
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: INFO    06:00:10 main.go:48: registering *ovsdbserver.Collector
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: INFO    06:00:10 main.go:48: registering *pmd_perf.Collector
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: INFO    06:00:10 main.go:48: registering *pmd_rxq.Collector
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: INFO    06:00:10 main.go:48: registering *vswitch.Collector
Dec 11 01:00:10 np0005554845 openstack_network_exporter[203488]: NOTICE  06:00:10 main.go:76: listening on https://:9105/metrics
Dec 11 01:00:10 np0005554845 podman[203462]: 2025-12-11 06:00:10.890772892 +0000 UTC m=+0.177365640 container start cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, io.openshift.expose-services=, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, name=ubi9-minimal, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, distribution-scope=public)
Dec 11 01:00:10 np0005554845 podman[203462]: openstack_network_exporter
Dec 11 01:00:10 np0005554845 systemd[1]: Started openstack_network_exporter container.
Dec 11 01:00:10 np0005554845 podman[203508]: 2025-12-11 06:00:10.969346967 +0000 UTC m=+0.063449713 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, release=1755695350, managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, distribution-scope=public, name=ubi9-minimal)
Dec 11 01:00:11 np0005554845 python3.9[203681]: ansible-ansible.builtin.find Invoked with file_type=directory paths=['/var/lib/openstack/healthchecks/'] patterns=[] read_whole_file=False age_stamp=mtime recurse=False hidden=False follow=False get_checksum=False checksum_algorithm=sha1 use_regex=False exact_mode=True excludes=None contains=None age=None size=None depth=None mode=None encoding=None limit=None
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.106 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.107 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.128 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.128 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.128 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.138 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.139 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.140 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.140 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.168 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.168 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.168 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.168 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.342 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.343 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5849MB free_disk=73.36510848999023GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.344 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.344 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.415 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.416 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.453 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.478 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.481 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:00:17 np0005554845 nova_compute[187128]: 2025-12-11 06:00:17.481 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:00:18 np0005554845 nova_compute[187128]: 2025-12-11 06:00:18.032 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:00:18 np0005554845 nova_compute[187128]: 2025-12-11 06:00:18.033 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:00:18 np0005554845 nova_compute[187128]: 2025-12-11 06:00:18.033 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:00:18 np0005554845 nova_compute[187128]: 2025-12-11 06:00:18.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:00:18 np0005554845 nova_compute[187128]: 2025-12-11 06:00:18.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:00:24 np0005554845 podman[203706]: 2025-12-11 06:00:24.168860373 +0000 UTC m=+0.097122680 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:00:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:00:26.212 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:00:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:00:26.212 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:00:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:00:26.212 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:00:31 np0005554845 podman[203732]: 2025-12-11 06:00:31.133456206 +0000 UTC m=+0.065132677 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=3, health_log=, config_id=edpm, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:00:31 np0005554845 systemd[1]: f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-478f6f1371c7a10a.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 01:00:31 np0005554845 systemd[1]: f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-478f6f1371c7a10a.service: Failed with result 'exit-code'.
Dec 11 01:00:32 np0005554845 podman[203751]: 2025-12-11 06:00:32.177358979 +0000 UTC m=+0.099159555 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 11 01:00:32 np0005554845 podman[203752]: 2025-12-11 06:00:32.204443201 +0000 UTC m=+0.122452886 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 11 01:00:35 np0005554845 podman[203794]: 2025-12-11 06:00:35.166987127 +0000 UTC m=+0.089929668 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd)
Dec 11 01:00:41 np0005554845 podman[203815]: 2025-12-11 06:00:41.125251052 +0000 UTC m=+0.060028342 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:00:41 np0005554845 podman[203816]: 2025-12-11 06:00:41.129019512 +0000 UTC m=+0.061682435 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 11 01:00:55 np0005554845 podman[203857]: 2025-12-11 06:00:55.175343726 +0000 UTC m=+0.099748900 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:01:01 np0005554845 podman[203995]: 2025-12-11 06:01:01.684970334 +0000 UTC m=+0.076119043 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=unhealthy, health_failing_streak=4, health_log=, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:01:01 np0005554845 systemd[1]: f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-478f6f1371c7a10a.service: Main process exited, code=exited, status=1/FAILURE
Dec 11 01:01:01 np0005554845 systemd[1]: f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d-478f6f1371c7a10a.service: Failed with result 'exit-code'.
Dec 11 01:01:01 np0005554845 python3.9[204040]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_controller'] executable=podman
Dec 11 01:01:02 np0005554845 podman[204179]: 2025-12-11 06:01:02.752458217 +0000 UTC m=+0.068559549 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 11 01:01:02 np0005554845 podman[204180]: 2025-12-11 06:01:02.816556595 +0000 UTC m=+0.126274205 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 11 01:01:02 np0005554845 python3.9[204241]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:03 np0005554845 systemd[1]: Started libpod-conmon-a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1.scope.
Dec 11 01:01:03 np0005554845 podman[204251]: 2025-12-11 06:01:03.07348987 +0000 UTC m=+0.096665877 container exec a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:01:03 np0005554845 podman[204251]: 2025-12-11 06:01:03.104875825 +0000 UTC m=+0.128051822 container exec_died a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:01:03 np0005554845 systemd[1]: libpod-conmon-a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1.scope: Deactivated successfully.
Dec 11 01:01:03 np0005554845 python3.9[204432]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_controller detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:04 np0005554845 systemd[1]: Started libpod-conmon-a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1.scope.
Dec 11 01:01:04 np0005554845 podman[204433]: 2025-12-11 06:01:04.540592464 +0000 UTC m=+0.539344558 container exec a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2)
Dec 11 01:01:04 np0005554845 podman[204451]: 2025-12-11 06:01:04.612588114 +0000 UTC m=+0.056290498 container exec_died a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Dec 11 01:01:04 np0005554845 podman[204433]: 2025-12-11 06:01:04.617890097 +0000 UTC m=+0.616642181 container exec_died a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller)
Dec 11 01:01:04 np0005554845 systemd[1]: libpod-conmon-a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1.scope: Deactivated successfully.
Dec 11 01:01:05 np0005554845 python3.9[204615]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_controller recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:06 np0005554845 podman[204739]: 2025-12-11 06:01:06.006012593 +0000 UTC m=+0.094602122 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:01:06 np0005554845 python3.9[204787]: ansible-containers.podman.podman_container_info Invoked with name=['ovn_metadata_agent'] executable=podman
Dec 11 01:01:07 np0005554845 python3.9[204953]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:07 np0005554845 systemd[1]: Started libpod-conmon-63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0.scope.
Dec 11 01:01:07 np0005554845 podman[204954]: 2025-12-11 06:01:07.198395991 +0000 UTC m=+0.175957664 container exec 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:01:07 np0005554845 podman[204954]: 2025-12-11 06:01:07.232880351 +0000 UTC m=+0.210442014 container exec_died 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:01:07 np0005554845 systemd[1]: libpod-conmon-63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0.scope: Deactivated successfully.
Dec 11 01:01:08 np0005554845 python3.9[205136]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ovn_metadata_agent detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:08 np0005554845 systemd[1]: Started libpod-conmon-63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0.scope.
Dec 11 01:01:08 np0005554845 podman[205137]: 2025-12-11 06:01:08.47755832 +0000 UTC m=+0.083043110 container exec 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:01:08 np0005554845 podman[205137]: 2025-12-11 06:01:08.487882838 +0000 UTC m=+0.093367598 container exec_died 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 01:01:08 np0005554845 systemd[1]: libpod-conmon-63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0.scope: Deactivated successfully.
Dec 11 01:01:09 np0005554845 python3.9[205321]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/ovn_metadata_agent recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:10 np0005554845 python3.9[205473]: ansible-containers.podman.podman_container_info Invoked with name=['multipathd'] executable=podman
Dec 11 01:01:10 np0005554845 python3.9[205638]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:10 np0005554845 systemd[1]: Started libpod-conmon-eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec.scope.
Dec 11 01:01:10 np0005554845 podman[205639]: 2025-12-11 06:01:10.894157266 +0000 UTC m=+0.067642765 container exec eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 01:01:10 np0005554845 podman[205639]: 2025-12-11 06:01:10.930161866 +0000 UTC m=+0.103647375 container exec_died eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:01:10 np0005554845 systemd[1]: libpod-conmon-eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec.scope: Deactivated successfully.
Dec 11 01:01:11 np0005554845 podman[205792]: 2025-12-11 06:01:11.489172054 +0000 UTC m=+0.068347194 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 11 01:01:11 np0005554845 podman[205793]: 2025-12-11 06:01:11.489098212 +0000 UTC m=+0.065395444 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.openshift.expose-services=, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, maintainer=Red Hat, Inc., vcs-type=git, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm)
Dec 11 01:01:11 np0005554845 python3.9[205862]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=multipathd detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:11 np0005554845 systemd[1]: Started libpod-conmon-eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec.scope.
Dec 11 01:01:11 np0005554845 podman[205863]: 2025-12-11 06:01:11.78471846 +0000 UTC m=+0.083599645 container exec eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 11 01:01:11 np0005554845 podman[205863]: 2025-12-11 06:01:11.820871564 +0000 UTC m=+0.119752749 container exec_died eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, config_id=multipathd, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 01:01:11 np0005554845 systemd[1]: libpod-conmon-eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec.scope: Deactivated successfully.
Dec 11 01:01:12 np0005554845 python3.9[206044]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/multipathd recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:13 np0005554845 python3.9[206196]: ansible-containers.podman.podman_container_info Invoked with name=['ceilometer_agent_compute'] executable=podman
Dec 11 01:01:14 np0005554845 python3.9[206361]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:14 np0005554845 systemd[1]: Started libpod-conmon-f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d.scope.
Dec 11 01:01:14 np0005554845 podman[206362]: 2025-12-11 06:01:14.128710639 +0000 UTC m=+0.076960896 container exec f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 11 01:01:14 np0005554845 podman[206382]: 2025-12-11 06:01:14.193542576 +0000 UTC m=+0.051732735 container exec_died f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_id=edpm, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Dec 11 01:01:14 np0005554845 podman[206362]: 2025-12-11 06:01:14.198766597 +0000 UTC m=+0.147016834 container exec_died f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3)
Dec 11 01:01:14 np0005554845 systemd[1]: libpod-conmon-f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d.scope: Deactivated successfully.
Dec 11 01:01:14 np0005554845 python3.9[206546]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=ceilometer_agent_compute detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:15 np0005554845 systemd[1]: Started libpod-conmon-f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d.scope.
Dec 11 01:01:15 np0005554845 podman[206547]: 2025-12-11 06:01:15.093284728 +0000 UTC m=+0.203998760 container exec f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 01:01:15 np0005554845 podman[206566]: 2025-12-11 06:01:15.241645537 +0000 UTC m=+0.134586318 container exec_died f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 11 01:01:15 np0005554845 podman[206547]: 2025-12-11 06:01:15.247567647 +0000 UTC m=+0.358281619 container exec_died f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute)
Dec 11 01:01:15 np0005554845 systemd[1]: libpod-conmon-f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d.scope: Deactivated successfully.
Dec 11 01:01:16 np0005554845 python3.9[206730]: ansible-ansible.builtin.file Invoked with group=42405 mode=0700 owner=42405 path=/var/lib/openstack/healthchecks/ceilometer_agent_compute recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.686 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.719 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.719 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.720 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.720 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.893 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.895 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5918MB free_disk=73.36479187011719GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.895 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.895 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.946 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.947 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.964 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.984 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.988 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:01:16 np0005554845 nova_compute[187128]: 2025-12-11 06:01:16.989 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.093s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:01:17 np0005554845 python3.9[206882]: ansible-containers.podman.podman_container_info Invoked with name=['node_exporter'] executable=podman
Dec 11 01:01:17 np0005554845 nova_compute[187128]: 2025-12-11 06:01:17.991 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:01:17 np0005554845 nova_compute[187128]: 2025-12-11 06:01:17.991 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:01:17 np0005554845 python3.9[207047]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:18 np0005554845 systemd[1]: Started libpod-conmon-1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb.scope.
Dec 11 01:01:18 np0005554845 podman[207048]: 2025-12-11 06:01:18.098287714 +0000 UTC m=+0.088358243 container exec 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:01:18 np0005554845 podman[207067]: 2025-12-11 06:01:18.161591311 +0000 UTC m=+0.052692512 container exec_died 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:01:18 np0005554845 podman[207048]: 2025-12-11 06:01:18.167977173 +0000 UTC m=+0.158047702 container exec_died 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:01:18 np0005554845 systemd[1]: libpod-conmon-1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb.scope: Deactivated successfully.
Dec 11 01:01:18 np0005554845 nova_compute[187128]: 2025-12-11 06:01:18.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:01:18 np0005554845 nova_compute[187128]: 2025-12-11 06:01:18.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:01:18 np0005554845 nova_compute[187128]: 2025-12-11 06:01:18.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:01:18 np0005554845 nova_compute[187128]: 2025-12-11 06:01:18.709 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:01:18 np0005554845 nova_compute[187128]: 2025-12-11 06:01:18.709 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:01:18 np0005554845 python3.9[207231]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=node_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:19 np0005554845 systemd[1]: Started libpod-conmon-1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb.scope.
Dec 11 01:01:19 np0005554845 podman[207232]: 2025-12-11 06:01:19.027011356 +0000 UTC m=+0.072901746 container exec 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:01:19 np0005554845 podman[207232]: 2025-12-11 06:01:19.057293023 +0000 UTC m=+0.103183393 container exec_died 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:01:19 np0005554845 systemd[1]: libpod-conmon-1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb.scope: Deactivated successfully.
Dec 11 01:01:19 np0005554845 nova_compute[187128]: 2025-12-11 06:01:19.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:01:19 np0005554845 nova_compute[187128]: 2025-12-11 06:01:19.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:01:19 np0005554845 nova_compute[187128]: 2025-12-11 06:01:19.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:01:19 np0005554845 python3.9[207416]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/node_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:20 np0005554845 python3.9[207568]: ansible-containers.podman.podman_container_info Invoked with name=['podman_exporter'] executable=podman
Dec 11 01:01:20 np0005554845 nova_compute[187128]: 2025-12-11 06:01:20.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:01:21 np0005554845 python3.9[207735]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:21 np0005554845 systemd[1]: Started libpod-conmon-4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a.scope.
Dec 11 01:01:21 np0005554845 podman[207736]: 2025-12-11 06:01:21.309441886 +0000 UTC m=+0.077963572 container exec 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:01:21 np0005554845 podman[207736]: 2025-12-11 06:01:21.619881084 +0000 UTC m=+0.388402750 container exec_died 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:01:21 np0005554845 systemd[1]: libpod-conmon-4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a.scope: Deactivated successfully.
Dec 11 01:01:22 np0005554845 python3.9[207919]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=podman_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:22 np0005554845 systemd[1]: Started libpod-conmon-4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a.scope.
Dec 11 01:01:22 np0005554845 podman[207920]: 2025-12-11 06:01:22.591001679 +0000 UTC m=+0.081197829 container exec 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:01:22 np0005554845 podman[207920]: 2025-12-11 06:01:22.626919837 +0000 UTC m=+0.117115997 container exec_died 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:01:22 np0005554845 systemd[1]: libpod-conmon-4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a.scope: Deactivated successfully.
Dec 11 01:01:23 np0005554845 python3.9[208103]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/podman_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:24 np0005554845 python3.9[208255]: ansible-containers.podman.podman_container_info Invoked with name=['openstack_network_exporter'] executable=podman
Dec 11 01:01:25 np0005554845 python3.9[208420]: ansible-containers.podman.podman_container_exec Invoked with command=id -u name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:25 np0005554845 systemd[1]: Started libpod-conmon-cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5.scope.
Dec 11 01:01:25 np0005554845 podman[208421]: 2025-12-11 06:01:25.094721294 +0000 UTC m=+0.072741332 container exec cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, name=ubi9-minimal, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, com.redhat.component=ubi9-minimal-container, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 11 01:01:25 np0005554845 podman[208421]: 2025-12-11 06:01:25.100557121 +0000 UTC m=+0.078577199 container exec_died cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, io.buildah.version=1.33.7, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, distribution-scope=public, container_name=openstack_network_exporter, managed_by=edpm_ansible, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., release=1755695350)
Dec 11 01:01:25 np0005554845 systemd[1]: libpod-conmon-cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5.scope: Deactivated successfully.
Dec 11 01:01:25 np0005554845 podman[208577]: 2025-12-11 06:01:25.661230764 +0000 UTC m=+0.070903673 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:01:25 np0005554845 python3.9[208629]: ansible-containers.podman.podman_container_exec Invoked with command=id -g name=openstack_network_exporter detach=False executable=podman privileged=False tty=False argv=None env=None user=None workdir=None
Dec 11 01:01:25 np0005554845 systemd[1]: Started libpod-conmon-cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5.scope.
Dec 11 01:01:25 np0005554845 podman[208630]: 2025-12-11 06:01:25.944139579 +0000 UTC m=+0.074539571 container exec cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, config_id=edpm, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, distribution-scope=public, vcs-type=git, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, io.buildah.version=1.33.7)
Dec 11 01:01:25 np0005554845 podman[208630]: 2025-12-11 06:01:25.976814409 +0000 UTC m=+0.107214371 container exec_died cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses 
microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.buildah.version=1.33.7, version=9.6, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., container_name=openstack_network_exporter, distribution-scope=public, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container)
Dec 11 01:01:26 np0005554845 systemd[1]: libpod-conmon-cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5.scope: Deactivated successfully.
Dec 11 01:01:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:01:26.213 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:01:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:01:26.213 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:01:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:01:26.213 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:01:26 np0005554845 python3.9[208811]: ansible-ansible.builtin.file Invoked with group=0 mode=0700 owner=0 path=/var/lib/openstack/healthchecks/openstack_network_exporter recurse=True state=directory force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:27 np0005554845 python3.9[208963]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall/ state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:28 np0005554845 python3.9[209115]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/telemetry.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 01:01:28 np0005554845 python3.9[209238]: ansible-ansible.legacy.copy Invoked with dest=/var/lib/edpm-config/firewall/telemetry.yaml mode=0640 src=/home/zuul/.ansible/tmp/ansible-tmp-1765432887.80376-3209-21933303908929/.source.yaml follow=False _original_basename=firewall.yaml.j2 checksum=d942d984493b214bda2913f753ff68cdcedff00e backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:29 np0005554845 python3.9[209390]: ansible-ansible.builtin.file Invoked with group=root mode=0750 owner=root path=/var/lib/edpm-config/firewall state=directory recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.094 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.095 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:01:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:01:30 np0005554845 python3.9[209542]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 01:01:31 np0005554845 python3.9[209620]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml _original_basename=base-rules.yaml.j2 recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-base.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:31 np0005554845 podman[209744]: 2025-12-11 06:01:31.889116578 +0000 UTC m=+0.069034383 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 01:01:32 np0005554845 python3.9[209791]: ansible-ansible.legacy.stat Invoked with path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 01:01:32 np0005554845 python3.9[209870]: ansible-ansible.legacy.file Invoked with mode=0644 dest=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml _original_basename=.oiht7dxy recurse=False state=file path=/var/lib/edpm-config/firewall/edpm-nftables-user-rules.yaml force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:33 np0005554845 podman[209926]: 2025-12-11 06:01:33.144296059 +0000 UTC m=+0.061999802 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:01:33 np0005554845 podman[209945]: 2025-12-11 06:01:33.175687905 +0000 UTC m=+0.086716508 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, io.buildah.version=1.41.3, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 11 01:01:33 np0005554845 python3.9[210063]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/iptables.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 01:01:33 np0005554845 python3.9[210141]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/iptables.nft _original_basename=iptables.nft recurse=False state=file path=/etc/nftables/iptables.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:34 np0005554845 python3.9[210293]: ansible-ansible.legacy.command Invoked with _raw_params=nft -j list ruleset _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 01:01:36 np0005554845 podman[210447]: 2025-12-11 06:01:36.164595837 +0000 UTC m=+0.086586424 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:01:36 np0005554845 python3[210446]: ansible-edpm_nftables_from_files Invoked with src=/var/lib/edpm-config/firewall
Dec 11 01:01:37 np0005554845 python3.9[210621]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 01:01:38 np0005554845 python3.9[210699]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:39 np0005554845 python3.9[210851]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-update-jumps.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 01:01:40 np0005554845 python3.9[210929]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-update-jumps.nft _original_basename=jump-chain.j2 recurse=False state=file path=/etc/nftables/edpm-update-jumps.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:41 np0005554845 python3.9[211081]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-flushes.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 01:01:41 np0005554845 python3.9[211159]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-flushes.nft _original_basename=flush-chain.j2 recurse=False state=file path=/etc/nftables/edpm-flushes.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:42 np0005554845 podman[211283]: 2025-12-11 06:01:42.085891537 +0000 UTC m=+0.061192880 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:01:42 np0005554845 podman[211284]: 2025-12-11 06:01:42.12641115 +0000 UTC m=+0.088003913 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vendor=Red Hat, Inc., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, version=9.6, name=ubi9-minimal, release=1755695350, build-date=2025-08-20T13:12:41)
Dec 11 01:01:42 np0005554845 python3.9[211346]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-chains.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 01:01:42 np0005554845 python3.9[211432]: ansible-ansible.legacy.file Invoked with group=root mode=0600 owner=root dest=/etc/nftables/edpm-chains.nft _original_basename=chains.j2 recurse=False state=file path=/etc/nftables/edpm-chains.nft force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:43 np0005554845 python3.9[211584]: ansible-ansible.legacy.stat Invoked with path=/etc/nftables/edpm-rules.nft follow=False get_checksum=True get_size=False checksum_algorithm=sha1 get_mime=True get_attributes=True get_selinux_context=False
Dec 11 01:01:44 np0005554845 python3.9[211709]: ansible-ansible.legacy.copy Invoked with dest=/etc/nftables/edpm-rules.nft group=root mode=0600 owner=root src=/home/zuul/.ansible/tmp/ansible-tmp-1765432903.0796306-3584-18824848134089/.source.nft follow=False _original_basename=ruleset.j2 checksum=fb3275eced3a2e06312143189928124e1b2df34a backup=False force=True remote_src=False unsafe_writes=False content=NOT_LOGGING_PARAMETER validate=None directory_mode=None local_follow=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:45 np0005554845 python3.9[211861]: ansible-ansible.builtin.file Invoked with group=root mode=0600 owner=root path=/etc/nftables/edpm-rules.nft.changed state=touch recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:45 np0005554845 python3.9[212013]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-chains.nft /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft /etc/nftables/edpm-jumps.nft | nft -c -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 01:01:46 np0005554845 python3.9[212168]: ansible-ansible.builtin.blockinfile Invoked with backup=False block=include "/etc/nftables/iptables.nft"#012include "/etc/nftables/edpm-chains.nft"#012include "/etc/nftables/edpm-rules.nft"#012include "/etc/nftables/edpm-jumps.nft"#012 path=/etc/sysconfig/nftables.conf validate=nft -c -f %s state=present marker=# {mark} ANSIBLE MANAGED BLOCK create=False marker_begin=BEGIN marker_end=END append_newline=False prepend_newline=False encoding=utf-8 unsafe_writes=False insertafter=None insertbefore=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:47 np0005554845 python3.9[212320]: ansible-ansible.legacy.command Invoked with _raw_params=nft -f /etc/nftables/edpm-chains.nft _uses_shell=False expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 01:01:48 np0005554845 python3.9[212473]: ansible-ansible.builtin.stat Invoked with path=/etc/nftables/edpm-rules.nft.changed follow=False get_checksum=True get_mime=True get_attributes=True get_selinux_context=False checksum_algorithm=sha1
Dec 11 01:01:49 np0005554845 python3.9[212627]: ansible-ansible.legacy.command Invoked with _raw_params=set -o pipefail; cat /etc/nftables/edpm-flushes.nft /etc/nftables/edpm-rules.nft /etc/nftables/edpm-update-jumps.nft | nft -f - _uses_shell=True expand_argument_vars=True stdin_add_newline=True strip_empty_ends=True cmd=None argv=None chdir=None executable=None creates=None removes=None stdin=None
Dec 11 01:01:50 np0005554845 python3.9[212782]: ansible-ansible.builtin.file Invoked with path=/etc/nftables/edpm-rules.nft.changed state=absent recurse=False force=False follow=True modification_time_format=%Y%m%d%H%M.%S access_time_format=%Y%m%d%H%M.%S unsafe_writes=False _original_basename=None _diff_peek=None src=None modification_time=None access_time=None mode=None owner=None group=None seuser=None serole=None selevel=None setype=None attributes=None
Dec 11 01:01:50 np0005554845 systemd[1]: session-27.scope: Deactivated successfully.
Dec 11 01:01:50 np0005554845 systemd[1]: session-27.scope: Consumed 1min 50.963s CPU time.
Dec 11 01:01:50 np0005554845 systemd-logind[789]: Session 27 logged out. Waiting for processes to exit.
Dec 11 01:01:50 np0005554845 systemd-logind[789]: Removed session 27.
Dec 11 01:01:56 np0005554845 podman[212807]: 2025-12-11 06:01:56.114545679 +0000 UTC m=+0.051971159 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:02:02 np0005554845 podman[212831]: 2025-12-11 06:02:02.134412086 +0000 UTC m=+0.071250041 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:02:04 np0005554845 podman[212852]: 2025-12-11 06:02:04.124469963 +0000 UTC m=+0.051676460 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:02:04 np0005554845 podman[212853]: 2025-12-11 06:02:04.20479286 +0000 UTC m=+0.123161497 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 01:02:07 np0005554845 podman[212897]: 2025-12-11 06:02:07.140015099 +0000 UTC m=+0.066405399 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 01:02:13 np0005554845 podman[212918]: 2025-12-11 06:02:13.13931813 +0000 UTC m=+0.059896734 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:02:13 np0005554845 podman[212919]: 2025-12-11 06:02:13.152595599 +0000 UTC m=+0.063182102 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, name=ubi9-minimal, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, architecture=x86_64)
Dec 11 01:02:17 np0005554845 nova_compute[187128]: 2025-12-11 06:02:17.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:02:17 np0005554845 nova_compute[187128]: 2025-12-11 06:02:17.709 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.725 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.725 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.725 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.726 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.882 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.884 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6000MB free_disk=73.36785125732422GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.884 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.884 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.954 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.955 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.974 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.990 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.992 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:02:18 np0005554845 nova_compute[187128]: 2025-12-11 06:02:18.992 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:02:19 np0005554845 nova_compute[187128]: 2025-12-11 06:02:19.992 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:02:19 np0005554845 nova_compute[187128]: 2025-12-11 06:02:19.993 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:02:19 np0005554845 nova_compute[187128]: 2025-12-11 06:02:19.993 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:02:20 np0005554845 nova_compute[187128]: 2025-12-11 06:02:20.018 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:02:20 np0005554845 nova_compute[187128]: 2025-12-11 06:02:20.019 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:02:20 np0005554845 nova_compute[187128]: 2025-12-11 06:02:20.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:02:20 np0005554845 nova_compute[187128]: 2025-12-11 06:02:20.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:02:21 np0005554845 nova_compute[187128]: 2025-12-11 06:02:21.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:02:21 np0005554845 nova_compute[187128]: 2025-12-11 06:02:21.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:02:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:02:26.215 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:02:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:02:26.215 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:02:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:02:26.216 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:02:27 np0005554845 podman[212961]: 2025-12-11 06:02:27.139145582 +0000 UTC m=+0.062189477 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:02:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:02:27.627 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=2, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=1) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:02:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:02:27.629 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 0 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:02:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:02:27.631 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '2'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:02:33 np0005554845 podman[212986]: 2025-12-11 06:02:33.119357855 +0000 UTC m=+0.058089465 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:02:35 np0005554845 podman[213006]: 2025-12-11 06:02:35.107621043 +0000 UTC m=+0.045320028 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent)
Dec 11 01:02:35 np0005554845 podman[213007]: 2025-12-11 06:02:35.192902323 +0000 UTC m=+0.126124707 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 11 01:02:38 np0005554845 podman[213055]: 2025-12-11 06:02:38.122141671 +0000 UTC m=+0.059904114 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:02:44 np0005554845 podman[213076]: 2025-12-11 06:02:44.11104398 +0000 UTC m=+0.046417538 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:02:44 np0005554845 podman[213077]: 2025-12-11 06:02:44.130497247 +0000 UTC m=+0.058871636 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter, vcs-type=git, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., release=1755695350, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 11 01:02:58 np0005554845 podman[213120]: 2025-12-11 06:02:58.119269623 +0000 UTC m=+0.052993567 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:03:04 np0005554845 podman[213144]: 2025-12-11 06:03:04.136231386 +0000 UTC m=+0.068088566 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, 
container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 01:03:06 np0005554845 podman[213166]: 2025-12-11 06:03:06.122843592 +0000 UTC m=+0.051772793 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS 
Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, managed_by=edpm_ansible, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202)
Dec 11 01:03:06 np0005554845 podman[213167]: 2025-12-11 06:03:06.161753017 +0000 UTC m=+0.081125479 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:03:09 np0005554845 podman[213212]: 2025-12-11 06:03:09.14742851 +0000 UTC m=+0.080254006 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:03:15 np0005554845 podman[213232]: 2025-12-11 06:03:15.122558319 +0000 UTC m=+0.054698303 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:03:15 np0005554845 podman[213233]: 2025-12-11 06:03:15.128074689 +0000 UTC m=+0.052357759 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, io.openshift.expose-services=, vendor=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, version=9.6, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible)
Dec 11 01:03:16 np0005554845 nova_compute[187128]: 2025-12-11 06:03:16.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:16 np0005554845 nova_compute[187128]: 2025-12-11 06:03:16.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145
Dec 11 01:03:17 np0005554845 nova_compute[187128]: 2025-12-11 06:03:17.076 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154
Dec 11 01:03:17 np0005554845 nova_compute[187128]: 2025-12-11 06:03:17.077 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:17 np0005554845 nova_compute[187128]: 2025-12-11 06:03:17.077 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183
Dec 11 01:03:17 np0005554845 nova_compute[187128]: 2025-12-11 06:03:17.201 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.215 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.216 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.337 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.338 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.338 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.339 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.550 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.551 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6029MB free_disk=73.36554336547852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.552 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.552 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.677 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.677 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.703 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing inventories for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.788 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating ProviderTree inventory for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.789 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.809 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing aggregate associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.834 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing trait associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, traits: COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.859 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.892 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.896 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 11 01:03:19 np0005554845 nova_compute[187128]: 2025-12-11 06:03:19.896 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.344s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:03:20 np0005554845 nova_compute[187128]: 2025-12-11 06:03:20.373 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:20 np0005554845 nova_compute[187128]: 2025-12-11 06:03:20.373 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:20 np0005554845 nova_compute[187128]: 2025-12-11 06:03:20.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:20 np0005554845 nova_compute[187128]: 2025-12-11 06:03:20.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 11 01:03:20 np0005554845 nova_compute[187128]: 2025-12-11 06:03:20.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 11 01:03:20 np0005554845 nova_compute[187128]: 2025-12-11 06:03:20.720 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 11 01:03:21 np0005554845 nova_compute[187128]: 2025-12-11 06:03:21.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:21 np0005554845 nova_compute[187128]: 2025-12-11 06:03:21.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:22 np0005554845 nova_compute[187128]: 2025-12-11 06:03:22.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:22 np0005554845 nova_compute[187128]: 2025-12-11 06:03:22.693 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:03:22 np0005554845 nova_compute[187128]: 2025-12-11 06:03:22.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 11 01:03:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:03:26.216 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:03:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:03:26.217 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:03:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:03:26.218 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:03:29 np0005554845 podman[213276]: 2025-12-11 06:03:29.130820348 +0000 UTC m=+0.060951812 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.096 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.097 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:03:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:03:35 np0005554845 podman[213301]: 2025-12-11 06:03:35.139908417 +0000 UTC m=+0.074249402 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Dec 11 01:03:37 np0005554845 podman[213321]: 2025-12-11 06:03:37.124296635 +0000 UTC m=+0.056973915 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 11 01:03:37 np0005554845 podman[213322]: 2025-12-11 06:03:37.143221858 +0000 UTC m=+0.072268269 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, config_id=ovn_controller, org.label-schema.license=GPLv2)
Dec 11 01:03:40 np0005554845 podman[213368]: 2025-12-11 06:03:40.164599317 +0000 UTC m=+0.087833589 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 11 01:03:46 np0005554845 podman[213389]: 2025-12-11 06:03:46.119792577 +0000 UTC m=+0.057969402 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:03:46 np0005554845 podman[213390]: 2025-12-11 06:03:46.153172422 +0000 UTC m=+0.078032276 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, vendor=Red Hat, Inc., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, container_name=openstack_network_exporter, config_id=edpm, architecture=x86_64, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public)
Dec 11 01:04:00 np0005554845 podman[213436]: 2025-12-11 06:04:00.132767484 +0000 UTC m=+0.060894091 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:04:06 np0005554845 podman[213460]: 2025-12-11 06:04:06.155588077 +0000 UTC m=+0.082492562 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 01:04:08 np0005554845 podman[213480]: 2025-12-11 06:04:08.125705295 +0000 UTC m=+0.056684205 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 11 01:04:08 np0005554845 podman[213481]: 2025-12-11 06:04:08.162612474 +0000 UTC m=+0.088183188 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202)
Dec 11 01:04:11 np0005554845 podman[213525]: 2025-12-11 06:04:11.142157569 +0000 UTC m=+0.074670081 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Dec 11 01:04:15 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:15.765 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=3, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=2) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 11 01:04:15 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:15.766 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 11 01:04:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:16.768 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '3'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 01:04:17 np0005554845 podman[213546]: 2025-12-11 06:04:17.152643819 +0000 UTC m=+0.074410114 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', 
'/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.openshift.expose-services=, distribution-scope=public, build-date=2025-08-20T13:12:41, name=ubi9-minimal, managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 11 01:04:17 np0005554845 podman[213545]: 2025-12-11 06:04:17.157330986 +0000 UTC m=+0.081434995 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:04:18 np0005554845 nova_compute[187128]: 2025-12-11 06:04:18.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:04:18 np0005554845 nova_compute[187128]: 2025-12-11 06:04:18.724 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:18 np0005554845 nova_compute[187128]: 2025-12-11 06:04:18.725 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:18 np0005554845 nova_compute[187128]: 2025-12-11 06:04:18.725 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:18 np0005554845 nova_compute[187128]: 2025-12-11 06:04:18.726 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:04:18 np0005554845 nova_compute[187128]: 2025-12-11 06:04:18.946 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:04:18 np0005554845 nova_compute[187128]: 2025-12-11 06:04:18.948 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=6057MB free_disk=73.36556243896484GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:04:18 np0005554845 nova_compute[187128]: 2025-12-11 06:04:18.948 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:18 np0005554845 nova_compute[187128]: 2025-12-11 06:04:18.948 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:19 np0005554845 nova_compute[187128]: 2025-12-11 06:04:19.207 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:04:19 np0005554845 nova_compute[187128]: 2025-12-11 06:04:19.207 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:04:19 np0005554845 nova_compute[187128]: 2025-12-11 06:04:19.230 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:04:19 np0005554845 nova_compute[187128]: 2025-12-11 06:04:19.258 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:04:19 np0005554845 nova_compute[187128]: 2025-12-11 06:04:19.260 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:04:19 np0005554845 nova_compute[187128]: 2025-12-11 06:04:19.260 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.312s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:20 np0005554845 nova_compute[187128]: 2025-12-11 06:04:20.262 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:04:20 np0005554845 nova_compute[187128]: 2025-12-11 06:04:20.688 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:04:20 np0005554845 nova_compute[187128]: 2025-12-11 06:04:20.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:04:21 np0005554845 nova_compute[187128]: 2025-12-11 06:04:21.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:04:21 np0005554845 nova_compute[187128]: 2025-12-11 06:04:21.707 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:04:21 np0005554845 nova_compute[187128]: 2025-12-11 06:04:21.708 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:04:21 np0005554845 nova_compute[187128]: 2025-12-11 06:04:21.708 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:04:21 np0005554845 nova_compute[187128]: 2025-12-11 06:04:21.724 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:04:23 np0005554845 nova_compute[187128]: 2025-12-11 06:04:23.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:04:23 np0005554845 nova_compute[187128]: 2025-12-11 06:04:23.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:04:23 np0005554845 nova_compute[187128]: 2025-12-11 06:04:23.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:04:24 np0005554845 nova_compute[187128]: 2025-12-11 06:04:24.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:04:24 np0005554845 nova_compute[187128]: 2025-12-11 06:04:24.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:04:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:26.217 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:26.218 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:26.218 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:31 np0005554845 podman[213588]: 2025-12-11 06:04:31.118002791 +0000 UTC m=+0.053723414 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:04:37 np0005554845 podman[213612]: 2025-12-11 06:04:37.174047214 +0000 UTC m=+0.101955120 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 01:04:39 np0005554845 podman[213633]: 2025-12-11 06:04:39.143527024 +0000 UTC m=+0.067529388 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 01:04:39 np0005554845 podman[213634]: 2025-12-11 06:04:39.230465137 +0000 UTC m=+0.150578116 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Dec 11 01:04:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:39.630 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e8:00:13 10.100.0.2 2001:db8::f816:3eff:fee8:13'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fee8:13/64', 'neutron:device_id': 'ovnmeta-8284078c-5f26-4a11-84b2-d1d04f724407', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8284078c-5f26-4a11-84b2-d1d04f724407', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8e2b2cc6-73f3-4f9c-b34b-8bc169a8406c, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=7fd976d5-e5fa-4565-97e8-ea40d4141153) old=Port_Binding(mac=['fa:16:3e:e8:00:13 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-8284078c-5f26-4a11-84b2-d1d04f724407', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8284078c-5f26-4a11-84b2-d1d04f724407', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches 
/usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:04:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:39.632 104320 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 7fd976d5-e5fa-4565-97e8-ea40d4141153 in datapath 8284078c-5f26-4a11-84b2-d1d04f724407 updated#033[00m
Dec 11 01:04:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:39.638 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8284078c-5f26-4a11-84b2-d1d04f724407, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:04:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:39.639 104320 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.default', '--privsep_sock_path', '/tmp/tmpekvbnetq/privsep.sock']#033[00m
Dec 11 01:04:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:40.341 104320 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec 11 01:04:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:40.342 104320 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpekvbnetq/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec 11 01:04:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:40.199 213683 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec 11 01:04:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:40.203 213683 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec 11 01:04:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:40.205 213683 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_NET_ADMIN|CAP_SYS_ADMIN|CAP_SYS_PTRACE/none#033[00m
Dec 11 01:04:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:40.205 213683 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213683#033[00m
Dec 11 01:04:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:40.345 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[79931c32-ee4a-4846-9275-231575f72722]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:04:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:40.876 213683 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:40.876 213683 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:40.876 213683 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:40.975 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0ef921bd-f662-438a-aae3-0bee18b5ae9f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:04:42 np0005554845 podman[213688]: 2025-12-11 06:04:42.138497267 +0000 UTC m=+0.074248271 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd)
Dec 11 01:04:48 np0005554845 podman[213709]: 2025-12-11 06:04:48.139629474 +0000 UTC m=+0.061926887 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, release=1755695350, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, distribution-scope=public, config_id=edpm, name=ubi9-minimal, vcs-type=git, version=9.6, architecture=x86_64, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc.)
Dec 11 01:04:48 np0005554845 podman[213708]: 2025-12-11 06:04:48.155403821 +0000 UTC m=+0.075527665 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.442 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "f2c66e64-57a7-4e97-8552-80a9d24397f6" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.443 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.474 187132 DEBUG nova.compute.manager [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.601 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.601 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.610 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.610 187132 INFO nova.compute.claims [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.705 187132 DEBUG nova.compute.provider_tree [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.718 187132 DEBUG nova.scheduler.client.report [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.737 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.136s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.738 187132 DEBUG nova.compute.manager [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.785 187132 DEBUG nova.compute.manager [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.785 187132 DEBUG nova.network.neutron [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.812 187132 INFO nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.832 187132 DEBUG nova.compute.manager [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.932 187132 DEBUG nova.compute.manager [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.933 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.934 187132 INFO nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Creating image(s)#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.935 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "/var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.935 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "/var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.936 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "/var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.936 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:48 np0005554845 nova_compute[187128]: 2025-12-11 06:04:48.937 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.199 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "4b205ab7-6622-4644-a404-ec948480d1ba" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.200 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.219 187132 DEBUG nova.compute.manager [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.308 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.308 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.315 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.316 187132 INFO nova.compute.claims [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.458 187132 DEBUG nova.compute.provider_tree [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.476 187132 DEBUG nova.scheduler.client.report [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.516 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.208s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.517 187132 DEBUG nova.compute.manager [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.597 187132 DEBUG nova.compute.manager [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.597 187132 DEBUG nova.network.neutron [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.642 187132 INFO nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.680 187132 WARNING oslo_policy.policy [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.680 187132 WARNING oslo_policy.policy [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] JSON formatted policy_file support is deprecated since Victoria release. You need to use YAML format which will be default in future. You can use ``oslopolicy-convert-json-to-yaml`` tool to convert existing JSON-formatted policy file to YAML-formatted in backward compatible way: https://docs.openstack.org/oslo.policy/latest/cli/oslopolicy-convert-json-to-yaml.html.#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.684 187132 DEBUG nova.policy [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.689 187132 DEBUG nova.compute.manager [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.805 187132 DEBUG nova.compute.manager [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.806 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.807 187132 INFO nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Creating image(s)#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.807 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.808 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.809 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.809 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:49 np0005554845 nova_compute[187128]: 2025-12-11 06:04:49.923 187132 DEBUG nova.policy [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:04:50 np0005554845 nova_compute[187128]: 2025-12-11 06:04:50.828 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:50 np0005554845 nova_compute[187128]: 2025-12-11 06:04:50.902 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165.part --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:50 np0005554845 nova_compute[187128]: 2025-12-11 06:04:50.904 187132 DEBUG nova.virt.images [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] 8999c077-a9de-4930-873b-81a3bd2d6c5f was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Dec 11 01:04:50 np0005554845 nova_compute[187128]: 2025-12-11 06:04:50.905 187132 DEBUG nova.privsep.utils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec 11 01:04:50 np0005554845 nova_compute[187128]: 2025-12-11 06:04:50.905 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165.part /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.129 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165.part /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165.converted" returned: 0 in 0.224s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.134 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.220 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165.converted --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.222 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 2.285s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.239 187132 INFO oslo.privsep.daemon [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'nova.privsep.sys_admin_pctxt', '--privsep_sock_path', '/tmp/tmpq1vs1r9q/privsep.sock']#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.240 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 1.431s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.241 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.893 187132 DEBUG nova.network.neutron [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Successfully created port: 6b225150-8014-4488-91e5-7faf65ace151 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.982 187132 DEBUG nova.network.neutron [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Successfully created port: f7e89a08-ebcf-4928-85e4-e649df5a3196 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.988 187132 INFO oslo.privsep.daemon [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.863 213770 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.868 213770 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.870 213770 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_CHOWN|CAP_DAC_OVERRIDE|CAP_DAC_READ_SEARCH|CAP_FOWNER|CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.871 213770 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213770#033[00m
Dec 11 01:04:51 np0005554845 nova_compute[187128]: 2025-12-11 06:04:51.992 187132 WARNING oslo_privsep.priv_context [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] privsep daemon already running#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.063 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.080 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.117 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.118 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.118 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.134 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.146 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.147 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.182 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.183 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.216 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.218 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.099s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.219 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.237 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.262 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.295 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.297 187132 DEBUG nova.virt.disk.api [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Checking if we can resize image /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.297 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.322 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.323 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.347 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.348 187132 DEBUG nova.virt.disk.api [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Cannot resize image /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.348 187132 DEBUG nova.objects.instance [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lazy-loading 'migration_context' on Instance uuid f2c66e64-57a7-4e97-8552-80a9d24397f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.373 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.373 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Ensure instance console log exists: /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.374 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.374 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.374 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.729 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk 1073741824" returned: 0 in 0.405s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.730 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.493s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.731 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.790 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.791 187132 DEBUG nova.virt.disk.api [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Checking if we can resize image /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.792 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.878 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.880 187132 DEBUG nova.virt.disk.api [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Cannot resize image /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.881 187132 DEBUG nova.objects.instance [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'migration_context' on Instance uuid 4b205ab7-6622-4644-a404-ec948480d1ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.898 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.899 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Ensure instance console log exists: /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.900 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.900 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:52 np0005554845 nova_compute[187128]: 2025-12-11 06:04:52.901 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:54 np0005554845 nova_compute[187128]: 2025-12-11 06:04:54.480 187132 DEBUG nova.network.neutron [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Successfully updated port: f7e89a08-ebcf-4928-85e4-e649df5a3196 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:04:54 np0005554845 nova_compute[187128]: 2025-12-11 06:04:54.496 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:04:54 np0005554845 nova_compute[187128]: 2025-12-11 06:04:54.497 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquired lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:04:54 np0005554845 nova_compute[187128]: 2025-12-11 06:04:54.497 187132 DEBUG nova.network.neutron [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:04:54 np0005554845 nova_compute[187128]: 2025-12-11 06:04:54.763 187132 DEBUG nova.network.neutron [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:04:54 np0005554845 nova_compute[187128]: 2025-12-11 06:04:54.983 187132 DEBUG nova.network.neutron [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Successfully updated port: 6b225150-8014-4488-91e5-7faf65ace151 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:04:55 np0005554845 nova_compute[187128]: 2025-12-11 06:04:55.004 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:04:55 np0005554845 nova_compute[187128]: 2025-12-11 06:04:55.004 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquired lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:04:55 np0005554845 nova_compute[187128]: 2025-12-11 06:04:55.005 187132 DEBUG nova.network.neutron [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:04:55 np0005554845 nova_compute[187128]: 2025-12-11 06:04:55.365 187132 DEBUG nova.network.neutron [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.145 187132 DEBUG nova.compute.manager [req-d8233926-e1f5-4a03-bcf2-6dc3c6af06b3 req-a3dc503b-f505-4611-a176-b59a8fb9329a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received event network-changed-f7e89a08-ebcf-4928-85e4-e649df5a3196 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.146 187132 DEBUG nova.compute.manager [req-d8233926-e1f5-4a03-bcf2-6dc3c6af06b3 req-a3dc503b-f505-4611-a176-b59a8fb9329a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Refreshing instance network info cache due to event network-changed-f7e89a08-ebcf-4928-85e4-e649df5a3196. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.146 187132 DEBUG oslo_concurrency.lockutils [req-d8233926-e1f5-4a03-bcf2-6dc3c6af06b3 req-a3dc503b-f505-4611-a176-b59a8fb9329a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.608 187132 DEBUG nova.network.neutron [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Updating instance_info_cache with network_info: [{"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.632 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Releasing lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.633 187132 DEBUG nova.compute.manager [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Instance network_info: |[{"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.634 187132 DEBUG oslo_concurrency.lockutils [req-d8233926-e1f5-4a03-bcf2-6dc3c6af06b3 req-a3dc503b-f505-4611-a176-b59a8fb9329a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.634 187132 DEBUG nova.network.neutron [req-d8233926-e1f5-4a03-bcf2-6dc3c6af06b3 req-a3dc503b-f505-4611-a176-b59a8fb9329a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Refreshing network info cache for port f7e89a08-ebcf-4928-85e4-e649df5a3196 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.640 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Start _get_guest_xml network_info=[{"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.648 187132 WARNING nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.653 187132 DEBUG nova.virt.libvirt.host [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.655 187132 DEBUG nova.virt.libvirt.host [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.668 187132 DEBUG nova.virt.libvirt.host [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.670 187132 DEBUG nova.virt.libvirt.host [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.672 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.673 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.674 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.674 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.675 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.675 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.676 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.676 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.676 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.677 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.677 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.678 187132 DEBUG nova.virt.hardware [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.685 187132 DEBUG nova.privsep.utils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.687 187132 DEBUG nova.virt.libvirt.vif [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:04:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1446310862',display_name='tempest-TestNetworkAdvancedServerOps-server-1446310862',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1446310862',id=2,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgM/qAsq/W6pl2lUgKsTugKsHWIU0fM/qatNQRqhtCy4/LE7hhfrHzJklvICL0fI2w3nlVpvfyHHrtL8lBcnC0/CqZJ2+IvhPWma1ca7/i1wyykXdHh2hWxolw5MKjldw==',key_name='tempest-TestNetworkAdvancedServerOps-1888270068',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-kodglh7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:04:49Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=4b205ab7-6622-4644-a404-ec948480d1ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.688 187132 DEBUG nova.network.os_vif_util [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.689 187132 DEBUG nova.network.os_vif_util [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:c0:6b,bridge_name='br-int',has_traffic_filtering=True,id=f7e89a08-ebcf-4928-85e4-e649df5a3196,network=Network(af86bfb7-241f-4a6e-8237-9d9593dd5fa4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e89a08-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.691 187132 DEBUG nova.objects.instance [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4b205ab7-6622-4644-a404-ec948480d1ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.711 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  <uuid>4b205ab7-6622-4644-a404-ec948480d1ba</uuid>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  <name>instance-00000002</name>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1446310862</nova:name>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:04:56</nova:creationTime>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:        <nova:user uuid="40cb523bfe1e4484bb2e91c903500c97">tempest-TestNetworkAdvancedServerOps-369129245-project-member</nova:user>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:        <nova:project uuid="3ec4c03cd7274517b88d9087ad4cbd83">tempest-TestNetworkAdvancedServerOps-369129245</nova:project>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:        <nova:port uuid="f7e89a08-ebcf-4928-85e4-e649df5a3196">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <entry name="serial">4b205ab7-6622-4644-a404-ec948480d1ba</entry>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <entry name="uuid">4b205ab7-6622-4644-a404-ec948480d1ba</entry>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.config"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:5f:c0:6b"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <target dev="tapf7e89a08-eb"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/console.log" append="off"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:04:56 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:04:56 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:04:56 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:04:56 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.713 187132 DEBUG nova.compute.manager [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Preparing to wait for external event network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.713 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.714 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.714 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.715 187132 DEBUG nova.virt.libvirt.vif [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:04:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1446310862',display_name='tempest-TestNetworkAdvancedServerOps-server-1446310862',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1446310862',id=2,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgM/qAsq/W6pl2lUgKsTugKsHWIU0fM/qatNQRqhtCy4/LE7hhfrHzJklvICL0fI2w3nlVpvfyHHrtL8lBcnC0/CqZJ2+IvhPWma1ca7/i1wyykXdHh2hWxolw5MKjldw==',key_name='tempest-TestNetworkAdvancedServerOps-1888270068',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-kodglh7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:04:49Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=4b205ab7-6622-4644-a404-ec948480d1ba,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.715 187132 DEBUG nova.network.os_vif_util [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.716 187132 DEBUG nova.network.os_vif_util [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5f:c0:6b,bridge_name='br-int',has_traffic_filtering=True,id=f7e89a08-ebcf-4928-85e4-e649df5a3196,network=Network(af86bfb7-241f-4a6e-8237-9d9593dd5fa4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e89a08-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.716 187132 DEBUG os_vif [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:c0:6b,bridge_name='br-int',has_traffic_filtering=True,id=f7e89a08-ebcf-4928-85e4-e649df5a3196,network=Network(af86bfb7-241f-4a6e-8237-9d9593dd5fa4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e89a08-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.760 187132 DEBUG ovsdbapp.backend.ovs_idl [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Created schema index Interface.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.760 187132 DEBUG ovsdbapp.backend.ovs_idl [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Created schema index Port.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.760 187132 DEBUG ovsdbapp.backend.ovs_idl [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Created schema index Bridge.name autocreate_indices /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/__init__.py:106#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.761 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] tcp:127.0.0.1:6640: entering CONNECTING _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.762 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [POLLOUT] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.762 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.763 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.764 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.767 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.780 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.780 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.781 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:04:56 np0005554845 nova_compute[187128]: 2025-12-11 06:04:56.781 187132 INFO oslo.privsep.daemon [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running privsep helper: ['sudo', 'nova-rootwrap', '/etc/nova/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/nova/nova.conf', '--config-file', '/etc/nova/nova-compute.conf', '--config-dir', '/etc/nova/nova.conf.d', '--privsep_context', 'vif_plug_ovs.privsep.vif_plug', '--privsep_sock_path', '/tmp/tmpb7m46eed/privsep.sock']#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.066 187132 DEBUG nova.compute.manager [req-407b60e1-69b9-4a8c-85d5-8c427cbc74e3 req-a7c6e9bb-68ef-46e4-87b9-b0e73b08cd0e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Received event network-changed-6b225150-8014-4488-91e5-7faf65ace151 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.067 187132 DEBUG nova.compute.manager [req-407b60e1-69b9-4a8c-85d5-8c427cbc74e3 req-a7c6e9bb-68ef-46e4-87b9-b0e73b08cd0e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Refreshing instance network info cache due to event network-changed-6b225150-8014-4488-91e5-7faf65ace151. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.068 187132 DEBUG oslo_concurrency.lockutils [req-407b60e1-69b9-4a8c-85d5-8c427cbc74e3 req-a7c6e9bb-68ef-46e4-87b9-b0e73b08cd0e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.173 187132 DEBUG nova.network.neutron [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Updating instance_info_cache with network_info: [{"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.425 187132 INFO oslo.privsep.daemon [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Spawned new privsep daemon via rootwrap#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.292 213807 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.300 213807 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.306 213807 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_DAC_OVERRIDE|CAP_NET_ADMIN/CAP_DAC_OVERRIDE|CAP_NET_ADMIN/none#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.306 213807 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213807#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.717 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Releasing lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.718 187132 DEBUG nova.compute.manager [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Instance network_info: |[{"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.719 187132 DEBUG oslo_concurrency.lockutils [req-407b60e1-69b9-4a8c-85d5-8c427cbc74e3 req-a7c6e9bb-68ef-46e4-87b9-b0e73b08cd0e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.719 187132 DEBUG nova.network.neutron [req-407b60e1-69b9-4a8c-85d5-8c427cbc74e3 req-a7c6e9bb-68ef-46e4-87b9-b0e73b08cd0e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Refreshing network info cache for port 6b225150-8014-4488-91e5-7faf65ace151 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.725 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Start _get_guest_xml network_info=[{"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.731 187132 WARNING nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.733 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.734 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf7e89a08-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.735 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf7e89a08-eb, col_values=(('external_ids', {'iface-id': 'f7e89a08-ebcf-4928-85e4-e649df5a3196', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5f:c0:6b', 'vm-uuid': '4b205ab7-6622-4644-a404-ec948480d1ba'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:04:57 np0005554845 NetworkManager[55529]: <info>  [1765433097.7390] manager: (tapf7e89a08-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/21)
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.739 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.741 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.742 187132 DEBUG nova.virt.libvirt.host [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.743 187132 DEBUG nova.virt.libvirt.host [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.746 187132 DEBUG nova.virt.libvirt.host [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.746 187132 DEBUG nova.virt.libvirt.host [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.747 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.748 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.748 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.748 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.748 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.749 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.749 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.749 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.749 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.749 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.749 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.750 187132 DEBUG nova.virt.hardware [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.753 187132 DEBUG nova.virt.libvirt.vif [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:04:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1021466613',display_name='tempest-TestNetworkBasicOps-server-1021466613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1021466613',id=1,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH+j5FDWj2GbHAPAE98Uw4tQgPQNj9jziabv99iPtZbTEcOFL2RudLP/QAtVoXbHMhkSVxf71retgLVhjIxHZe68LaLI6P9zas5/bYFBwjZF2VjQRJddZnimyVDztq19nQ==',key_name='tempest-TestNetworkBasicOps-1647276356',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fce35ab888e44e46b3108813dcdf4163',ramdisk_id='',reservation_id='r-nfr2nn4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1486719489',owner_user_name='tempest-TestNetworkBasicOps-1486719489-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:04:48Z,user_data=None,user_id='3b482a000b3e4b5c964be05bad2a0418',uuid=f2c66e64-57a7-4e97-8552-80a9d24397f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.754 187132 DEBUG nova.network.os_vif_util [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converting VIF {"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.755 187132 DEBUG nova.network.os_vif_util [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:57:fd,bridge_name='br-int',has_traffic_filtering=True,id=6b225150-8014-4488-91e5-7faf65ace151,network=Network(869c578a-42b0-4a82-a564-a3681a196ad7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b225150-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.755 187132 DEBUG nova.objects.instance [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lazy-loading 'pci_devices' on Instance uuid f2c66e64-57a7-4e97-8552-80a9d24397f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.759 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.760 187132 INFO os_vif [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5f:c0:6b,bridge_name='br-int',has_traffic_filtering=True,id=f7e89a08-ebcf-4928-85e4-e649df5a3196,network=Network(af86bfb7-241f-4a6e-8237-9d9593dd5fa4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e89a08-eb')#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.833 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  <uuid>f2c66e64-57a7-4e97-8552-80a9d24397f6</uuid>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  <name>instance-00000001</name>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestNetworkBasicOps-server-1021466613</nova:name>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:04:57</nova:creationTime>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:        <nova:user uuid="3b482a000b3e4b5c964be05bad2a0418">tempest-TestNetworkBasicOps-1486719489-project-member</nova:user>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:        <nova:project uuid="fce35ab888e44e46b3108813dcdf4163">tempest-TestNetworkBasicOps-1486719489</nova:project>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:        <nova:port uuid="6b225150-8014-4488-91e5-7faf65ace151">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <entry name="serial">f2c66e64-57a7-4e97-8552-80a9d24397f6</entry>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <entry name="uuid">f2c66e64-57a7-4e97-8552-80a9d24397f6</entry>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.config"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:8c:57:fd"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <target dev="tap6b225150-80"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/console.log" append="off"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:04:57 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:04:57 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:04:57 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:04:57 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.834 187132 DEBUG nova.compute.manager [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Preparing to wait for external event network-vif-plugged-6b225150-8014-4488-91e5-7faf65ace151 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.835 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.835 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.836 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.836 187132 DEBUG nova.virt.libvirt.vif [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:04:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1021466613',display_name='tempest-TestNetworkBasicOps-server-1021466613',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1021466613',id=1,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH+j5FDWj2GbHAPAE98Uw4tQgPQNj9jziabv99iPtZbTEcOFL2RudLP/QAtVoXbHMhkSVxf71retgLVhjIxHZe68LaLI6P9zas5/bYFBwjZF2VjQRJddZnimyVDztq19nQ==',key_name='tempest-TestNetworkBasicOps-1647276356',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fce35ab888e44e46b3108813dcdf4163',ramdisk_id='',reservation_id='r-nfr2nn4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1486719489',owner_user_name='tempest-TestNetworkBasicOps-1486719489-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:04:48Z,user_data=None,user_id='3b482a000b3e4b5c964be05bad2a0418',uuid=f2c66e64-57a7-4e97-8552-80a9d24397f6,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.836 187132 DEBUG nova.network.os_vif_util [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converting VIF {"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.837 187132 DEBUG nova.network.os_vif_util [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8c:57:fd,bridge_name='br-int',has_traffic_filtering=True,id=6b225150-8014-4488-91e5-7faf65ace151,network=Network(869c578a-42b0-4a82-a564-a3681a196ad7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b225150-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.837 187132 DEBUG os_vif [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:57:fd,bridge_name='br-int',has_traffic_filtering=True,id=6b225150-8014-4488-91e5-7faf65ace151,network=Network(869c578a-42b0-4a82-a564-a3681a196ad7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b225150-80') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.838 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.838 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.839 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.841 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.841 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap6b225150-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.842 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap6b225150-80, col_values=(('external_ids', {'iface-id': '6b225150-8014-4488-91e5-7faf65ace151', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8c:57:fd', 'vm-uuid': 'f2c66e64-57a7-4e97-8552-80a9d24397f6'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.843 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:57 np0005554845 NetworkManager[55529]: <info>  [1765433097.8439] manager: (tap6b225150-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/22)
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.844 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.849 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:57 np0005554845 nova_compute[187128]: 2025-12-11 06:04:57.850 187132 INFO os_vif [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8c:57:fd,bridge_name='br-int',has_traffic_filtering=True,id=6b225150-8014-4488-91e5-7faf65ace151,network=Network(869c578a-42b0-4a82-a564-a3681a196ad7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b225150-80')#033[00m
Dec 11 01:04:58 np0005554845 nova_compute[187128]: 2025-12-11 06:04:58.015 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:04:58 np0005554845 nova_compute[187128]: 2025-12-11 06:04:58.015 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:04:58 np0005554845 nova_compute[187128]: 2025-12-11 06:04:58.015 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No VIF found with MAC fa:16:3e:5f:c0:6b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:04:58 np0005554845 nova_compute[187128]: 2025-12-11 06:04:58.016 187132 INFO nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Using config drive#033[00m
Dec 11 01:04:58 np0005554845 nova_compute[187128]: 2025-12-11 06:04:58.040 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:04:58 np0005554845 nova_compute[187128]: 2025-12-11 06:04:58.041 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:04:58 np0005554845 nova_compute[187128]: 2025-12-11 06:04:58.041 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] No VIF found with MAC fa:16:3e:8c:57:fd, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:04:58 np0005554845 nova_compute[187128]: 2025-12-11 06:04:58.041 187132 INFO nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Using config drive#033[00m
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.416 187132 INFO nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Creating config drive at /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.config#033[00m
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.427 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph2nhum0_ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.448 187132 INFO nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Creating config drive at /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.config#033[00m
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.453 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcozz2ax9 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.553 187132 DEBUG oslo_concurrency.processutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmph2nhum0_" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.592 187132 DEBUG oslo_concurrency.processutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcozz2ax9" returned: 0 in 0.139s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:04:59 np0005554845 kernel: tun: Universal TUN/TAP device driver, 1.6
Dec 11 01:04:59 np0005554845 NetworkManager[55529]: <info>  [1765433099.6222] manager: (tap6b225150-80): new Tun device (/org/freedesktop/NetworkManager/Devices/23)
Dec 11 01:04:59 np0005554845 kernel: tap6b225150-80: entered promiscuous mode
Dec 11 01:04:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:04:59Z|00027|binding|INFO|Claiming lport 6b225150-8014-4488-91e5-7faf65ace151 for this chassis.
Dec 11 01:04:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:04:59Z|00028|binding|INFO|6b225150-8014-4488-91e5-7faf65ace151: Claiming fa:16:3e:8c:57:fd 10.100.0.5
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.628 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.636 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:59 np0005554845 NetworkManager[55529]: <info>  [1765433099.6417] manager: (tapf7e89a08-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/24)
Dec 11 01:04:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:59.650 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:57:fd 10.100.0.5'], port_security=['fa:16:3e:8c:57:fd 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-869c578a-42b0-4a82-a564-a3681a196ad7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fce35ab888e44e46b3108813dcdf4163', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dbc12d2a-513a-45e0-9da7-c3b6cdd3e2e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=72246518-7492-4032-b5a8-4189af5b12a8, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=6b225150-8014-4488-91e5-7faf65ace151) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:04:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:59.651 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 6b225150-8014-4488-91e5-7faf65ace151 in datapath 869c578a-42b0-4a82-a564-a3681a196ad7 bound to our chassis#033[00m
Dec 11 01:04:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:59.653 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 869c578a-42b0-4a82-a564-a3681a196ad7#033[00m
Dec 11 01:04:59 np0005554845 systemd-udevd[213850]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:04:59 np0005554845 systemd-udevd[213851]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:04:59 np0005554845 NetworkManager[55529]: <info>  [1765433099.6921] device (tap6b225150-80): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:04:59 np0005554845 NetworkManager[55529]: <info>  [1765433099.6929] device (tap6b225150-80): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:04:59 np0005554845 systemd-machined[153381]: New machine qemu-1-instance-00000001.
Dec 11 01:04:59 np0005554845 kernel: tapf7e89a08-eb: entered promiscuous mode
Dec 11 01:04:59 np0005554845 NetworkManager[55529]: <info>  [1765433099.7155] device (tapf7e89a08-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:04:59 np0005554845 NetworkManager[55529]: <info>  [1765433099.7162] device (tapf7e89a08-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:04:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:04:59Z|00029|binding|INFO|Claiming lport f7e89a08-ebcf-4928-85e4-e649df5a3196 for this chassis.
Dec 11 01:04:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:04:59Z|00030|binding|INFO|f7e89a08-ebcf-4928-85e4-e649df5a3196: Claiming fa:16:3e:5f:c0:6b 10.100.0.10
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.719 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:59 np0005554845 systemd[1]: Started Virtual Machine qemu-1-instance-00000001.
Dec 11 01:04:59 np0005554845 systemd-machined[153381]: New machine qemu-2-instance-00000002.
Dec 11 01:04:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:04:59Z|00031|binding|INFO|Setting lport 6b225150-8014-4488-91e5-7faf65ace151 ovn-installed in OVS
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.726 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:04:59Z|00032|binding|INFO|Setting lport 6b225150-8014-4488-91e5-7faf65ace151 up in Southbound
Dec 11 01:04:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:04:59.732 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:c0:6b 10.100.0.10'], port_security=['fa:16:3e:5f:c0:6b 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af86bfb7-241f-4a6e-8237-9d9593dd5fa4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'ba02ffb9-9586-46dd-b538-cc6860e13640', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eaab7f8a-7d81-44bb-8e94-37cb560116c2, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=f7e89a08-ebcf-4928-85e4-e649df5a3196) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.735 187132 DEBUG nova.network.neutron [req-d8233926-e1f5-4a03-bcf2-6dc3c6af06b3 req-a3dc503b-f505-4611-a176-b59a8fb9329a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Updated VIF entry in instance network info cache for port f7e89a08-ebcf-4928-85e4-e649df5a3196. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.736 187132 DEBUG nova.network.neutron [req-d8233926-e1f5-4a03-bcf2-6dc3c6af06b3 req-a3dc503b-f505-4611-a176-b59a8fb9329a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Updating instance_info_cache with network_info: [{"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:04:59 np0005554845 systemd[1]: Started Virtual Machine qemu-2-instance-00000002.
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.763 187132 DEBUG oslo_concurrency.lockutils [req-d8233926-e1f5-4a03-bcf2-6dc3c6af06b3 req-a3dc503b-f505-4611-a176-b59a8fb9329a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.770 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:04:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:04:59Z|00033|binding|INFO|Setting lport f7e89a08-ebcf-4928-85e4-e649df5a3196 ovn-installed in OVS
Dec 11 01:04:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:04:59Z|00034|binding|INFO|Setting lport f7e89a08-ebcf-4928-85e4-e649df5a3196 up in Southbound
Dec 11 01:04:59 np0005554845 nova_compute[187128]: 2025-12-11 06:04:59.774 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.061 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433100.0607553, 4b205ab7-6622-4644-a404-ec948480d1ba => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.063 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] VM Started (Lifecycle Event)#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.107 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.111 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433100.0608592, 4b205ab7-6622-4644-a404-ec948480d1ba => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.112 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.123 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6d91b070-829e-4106-a53e-2f073f656270]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.124 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap869c578a-41 in ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.126 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap869c578a-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.127 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c001cd0f-4883-4f64-abb6-ff7990dd4cd4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.127 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7e08837d-d29e-4628-b255-1b7f1326723c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.149 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.149 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[eb5b4af8-72fc-4894-8b77-b5af6896890d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.152 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.171 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.171 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433100.133248, f2c66e64-57a7-4e97-8552-80a9d24397f6 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.171 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] VM Started (Lifecycle Event)#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.174 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[721c2f28-c85f-496b-90f1-eb0b16003cfc]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.175 104320 INFO oslo.privsep.daemon [-] Running privsep helper: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'privsep-helper', '--config-file', '/etc/neutron/neutron.conf', '--config-dir', '/etc/neutron.conf.d', '--privsep_context', 'neutron.privileged.link_cmd', '--privsep_sock_path', '/tmp/tmpm0shl634/privsep.sock']#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.187 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.190 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433100.133321, f2c66e64-57a7-4e97-8552-80a9d24397f6 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.190 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.205 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.208 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.229 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.845 187132 DEBUG nova.network.neutron [req-407b60e1-69b9-4a8c-85d5-8c427cbc74e3 req-a7c6e9bb-68ef-46e4-87b9-b0e73b08cd0e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Updated VIF entry in instance network info cache for port 6b225150-8014-4488-91e5-7faf65ace151. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.846 187132 DEBUG nova.network.neutron [req-407b60e1-69b9-4a8c-85d5-8c427cbc74e3 req-a7c6e9bb-68ef-46e4-87b9-b0e73b08cd0e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Updating instance_info_cache with network_info: [{"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.861 104320 INFO oslo.privsep.daemon [-] Spawned new privsep daemon via rootwrap#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.862 104320 DEBUG oslo.privsep.daemon [-] Accepted privsep connection to /tmp/tmpm0shl634/privsep.sock __init__ /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:362#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.744 213899 INFO oslo.privsep.daemon [-] privsep daemon starting#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.750 213899 INFO oslo.privsep.daemon [-] privsep process running with uid/gid: 0/0#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.752 213899 INFO oslo.privsep.daemon [-] privsep process running with capabilities (eff/prm/inh): CAP_NET_ADMIN|CAP_SYS_ADMIN/CAP_NET_ADMIN|CAP_SYS_ADMIN/none#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.752 213899 INFO oslo.privsep.daemon [-] privsep daemon running as pid 213899#033[00m
Dec 11 01:05:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:00.865 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[8b190887-2d39-473f-b84c-1cc1b58931a0]: (2,) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:00 np0005554845 nova_compute[187128]: 2025-12-11 06:05:00.866 187132 DEBUG oslo_concurrency.lockutils [req-407b60e1-69b9-4a8c-85d5-8c427cbc74e3 req-a7c6e9bb-68ef-46e4-87b9-b0e73b08cd0e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:05:01 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:01.397 213899 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "context-manager" by "neutron_lib.db.api._create_context_manager" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:01 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:01.398 213899 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" acquired by "neutron_lib.db.api._create_context_manager" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:01 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:01.398 213899 DEBUG oslo_concurrency.lockutils [-] Lock "context-manager" "released" by "neutron_lib.db.api._create_context_manager" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:01 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:01.971 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[f96dbfef-0f6f-4d8a-8ebe-4933801943d6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.738 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b3a17e0d-698d-46ff-ae99-7e30c99ac4e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:02 np0005554845 NetworkManager[55529]: <info>  [1765433102.7408] manager: (tap869c578a-40): new Veth device (/org/freedesktop/NetworkManager/Devices/25)
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.771 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[c08166d4-4f2c-4cca-b7e4-01198ad24c7d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.773 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[24fc71f0-6ef0-4b07-a743-ce6da12834c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:02 np0005554845 systemd-udevd[213922]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:05:02 np0005554845 NetworkManager[55529]: <info>  [1765433102.7970] device (tap869c578a-40): carrier: link connected
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.800 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[ef79a0d6-e1cc-43a7-a6c8-bd3163b49500]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:02 np0005554845 podman[213908]: 2025-12-11 06:05:02.813555922 +0000 UTC m=+0.060426786 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.821 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0c219c80-2d19-483a-9a59-ad972baa6eee]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap869c578a-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:9d:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337744, 'reachable_time': 30169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 213947, 'error': None, 'target': 'ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.837 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6738571b-3e6d-4992-b67b-fe99ca3f4037]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fefd:9d71'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 337744, 'tstamp': 337744}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 213953, 'error': None, 'target': 'ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:02 np0005554845 nova_compute[187128]: 2025-12-11 06:05:02.844 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.852 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ec8f357b-2e00-41ad-b35e-f419bc745bcd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap869c578a-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:fd:9d:71'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 13], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337744, 'reachable_time': 30169, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 213954, 'error': None, 'target': 'ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.878 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[08dccb9d-7c52-4adf-a085-cd62e7dc8901]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.936 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8d814bd2-1f64-414a-92ef-6c19545d14b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.938 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap869c578a-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.939 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.939 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap869c578a-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:02 np0005554845 nova_compute[187128]: 2025-12-11 06:05:02.941 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:02 np0005554845 kernel: tap869c578a-40: entered promiscuous mode
Dec 11 01:05:02 np0005554845 NetworkManager[55529]: <info>  [1765433102.9440] manager: (tap869c578a-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/26)
Dec 11 01:05:02 np0005554845 nova_compute[187128]: 2025-12-11 06:05:02.944 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.946 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap869c578a-40, col_values=(('external_ids', {'iface-id': 'c8ffeef2-7a6e-414a-8ca6-6cf7e6bf2700'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:02 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:02Z|00035|binding|INFO|Releasing lport c8ffeef2-7a6e-414a-8ca6-6cf7e6bf2700 from this chassis (sb_readonly=0)
Dec 11 01:05:02 np0005554845 nova_compute[187128]: 2025-12-11 06:05:02.947 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:02 np0005554845 nova_compute[187128]: 2025-12-11 06:05:02.972 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.973 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/869c578a-42b0-4a82-a564-a3681a196ad7.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/869c578a-42b0-4a82-a564-a3681a196ad7.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.974 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7f7f6c56-a866-4c98-94e7-a4cd63d57405]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.976 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-869c578a-42b0-4a82-a564-a3681a196ad7
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/869c578a-42b0-4a82-a564-a3681a196ad7.pid.haproxy
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 869c578a-42b0-4a82-a564-a3681a196ad7
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:05:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:02.976 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7', 'env', 'PROCESS_TAG=haproxy-869c578a-42b0-4a82-a564-a3681a196ad7', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/869c578a-42b0-4a82-a564-a3681a196ad7.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:05:03 np0005554845 podman[213987]: 2025-12-11 06:05:03.407767834 +0000 UTC m=+0.068893837 container create 1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:05:03 np0005554845 systemd[1]: Started libpod-conmon-1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165.scope.
Dec 11 01:05:03 np0005554845 podman[213987]: 2025-12-11 06:05:03.368145001 +0000 UTC m=+0.029271014 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:05:03 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:05:03 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/0c1f5029eaa9687fb645cb97918d9fa85c2f210e49fb62adeb9a75d621d9890a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:05:03 np0005554845 podman[213987]: 2025-12-11 06:05:03.485565448 +0000 UTC m=+0.146691541 container init 1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:05:03 np0005554845 podman[213987]: 2025-12-11 06:05:03.495651521 +0000 UTC m=+0.156777554 container start 1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202)
Dec 11 01:05:03 np0005554845 neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7[214002]: [NOTICE]   (214006) : New worker (214008) forked
Dec 11 01:05:03 np0005554845 neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7[214002]: [NOTICE]   (214006) : Loading success.
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.566 104320 INFO neutron.agent.ovn.metadata.agent [-] Port f7e89a08-ebcf-4928-85e4-e649df5a3196 in datapath af86bfb7-241f-4a6e-8237-9d9593dd5fa4 unbound from our chassis#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.569 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network af86bfb7-241f-4a6e-8237-9d9593dd5fa4#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.582 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a1e2f009-bfaa-4d5c-954a-11fd2aacc66d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.583 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapaf86bfb7-21 in ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.585 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapaf86bfb7-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.585 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6cd0a674-a636-48f8-b7f9-1057bbfb94bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.587 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b92220f5-8994-426e-bb24-ca9c6229b731]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.614 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a91ae4-7bea-40c8-a1d8-e9ccf3959fb4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.632 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[67c76966-b69a-482b-871b-431660c0d948]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.667 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[103c1aae-71e8-4028-9da5-197a6c660cd0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 systemd-udevd[213930]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:05:03 np0005554845 NetworkManager[55529]: <info>  [1765433103.6797] manager: (tapaf86bfb7-20): new Veth device (/org/freedesktop/NetworkManager/Devices/27)
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.679 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a9961754-7abb-43c4-bd96-c9695fac166d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.723 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[6a09f2a5-12f8-4013-9545-61a72703aa57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.728 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[08e35a1a-6c86-4f51-a7d2-e63dbedc425f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 NetworkManager[55529]: <info>  [1765433103.7653] device (tapaf86bfb7-20): carrier: link connected
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.773 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[9df1e7a2-7162-4674-bc75-7d2162f4d31a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.803 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[239749b9-c995-4f08-b5c8-db66a18e21cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf86bfb7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:c6:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337841, 'reachable_time': 39418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214028, 'error': None, 'target': 'ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.820 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[41b26894-9fe6-4b0f-9ed8-7de0318e224e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe88:c674'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 337841, 'tstamp': 337841}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214029, 'error': None, 'target': 'ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.837 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ddee71b4-c032-430c-9f67-b11ddaa78c41]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapaf86bfb7-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:88:c6:74'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 14], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337841, 'reachable_time': 39418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214030, 'error': None, 'target': 'ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.871 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b28b93f5-8d65-4d95-89f2-2f52035bd0b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.953 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d600b490-71c9-42b3-a544-f3c30675dd63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.954 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf86bfb7-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.955 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.955 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaf86bfb7-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:03 np0005554845 NetworkManager[55529]: <info>  [1765433103.9579] manager: (tapaf86bfb7-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/28)
Dec 11 01:05:03 np0005554845 nova_compute[187128]: 2025-12-11 06:05:03.957 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:03 np0005554845 kernel: tapaf86bfb7-20: entered promiscuous mode
Dec 11 01:05:03 np0005554845 nova_compute[187128]: 2025-12-11 06:05:03.960 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.962 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapaf86bfb7-20, col_values=(('external_ids', {'iface-id': '987ee34a-a0fa-428c-8f69-b38de389e7ea'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:03 np0005554845 nova_compute[187128]: 2025-12-11 06:05:03.963 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:03 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:03Z|00036|binding|INFO|Releasing lport 987ee34a-a0fa-428c-8f69-b38de389e7ea from this chassis (sb_readonly=0)
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.966 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/af86bfb7-241f-4a6e-8237-9d9593dd5fa4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/af86bfb7-241f-4a6e-8237-9d9593dd5fa4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.967 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[aa697768-8f9b-4a91-8660-01ea8d2825e4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.967 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-af86bfb7-241f-4a6e-8237-9d9593dd5fa4
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/af86bfb7-241f-4a6e-8237-9d9593dd5fa4.pid.haproxy
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID af86bfb7-241f-4a6e-8237-9d9593dd5fa4
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:05:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:03.968 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4', 'env', 'PROCESS_TAG=haproxy-af86bfb7-241f-4a6e-8237-9d9593dd5fa4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/af86bfb7-241f-4a6e-8237-9d9593dd5fa4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:05:03 np0005554845 nova_compute[187128]: 2025-12-11 06:05:03.977 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:04 np0005554845 podman[214061]: 2025-12-11 06:05:04.448123709 +0000 UTC m=+0.097866120 container create a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 01:05:04 np0005554845 podman[214061]: 2025-12-11 06:05:04.371007992 +0000 UTC m=+0.020750433 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:05:04 np0005554845 systemd[1]: Started libpod-conmon-a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5.scope.
Dec 11 01:05:04 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:05:04 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/b74cd6c3b033207cdcef29c8878175a5ee8d3376baf441b018a337ae21ae52da/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:05:04 np0005554845 podman[214061]: 2025-12-11 06:05:04.521383901 +0000 UTC m=+0.171126332 container init a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 11 01:05:04 np0005554845 podman[214061]: 2025-12-11 06:05:04.525963835 +0000 UTC m=+0.175706246 container start a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:05:04 np0005554845 neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4[214076]: [NOTICE]   (214080) : New worker (214082) forked
Dec 11 01:05:04 np0005554845 neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4[214076]: [NOTICE]   (214080) : Loading success.
Dec 11 01:05:04 np0005554845 nova_compute[187128]: 2025-12-11 06:05:04.773 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:07 np0005554845 nova_compute[187128]: 2025-12-11 06:05:07.903 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:08 np0005554845 podman[214091]: 2025-12-11 06:05:08.126402253 +0000 UTC m=+0.057617745 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 11 01:05:09 np0005554845 nova_compute[187128]: 2025-12-11 06:05:09.774 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:10 np0005554845 podman[214112]: 2025-12-11 06:05:10.158741352 +0000 UTC m=+0.093140739 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:05:10 np0005554845 podman[214113]: 2025-12-11 06:05:10.164339174 +0000 UTC m=+0.097321592 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 11 01:05:10 np0005554845 nova_compute[187128]: 2025-12-11 06:05:10.999 187132 DEBUG nova.compute.manager [req-5d0d64b8-caf3-49e0-ac27-f0b495cab2eb req-ab771731-a5e9-45ce-ba3f-2a1af0ba4eb4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Received event network-vif-plugged-6b225150-8014-4488-91e5-7faf65ace151 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:10.999 187132 DEBUG oslo_concurrency.lockutils [req-5d0d64b8-caf3-49e0-ac27-f0b495cab2eb req-ab771731-a5e9-45ce-ba3f-2a1af0ba4eb4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.000 187132 DEBUG oslo_concurrency.lockutils [req-5d0d64b8-caf3-49e0-ac27-f0b495cab2eb req-ab771731-a5e9-45ce-ba3f-2a1af0ba4eb4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.000 187132 DEBUG oslo_concurrency.lockutils [req-5d0d64b8-caf3-49e0-ac27-f0b495cab2eb req-ab771731-a5e9-45ce-ba3f-2a1af0ba4eb4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.000 187132 DEBUG nova.compute.manager [req-5d0d64b8-caf3-49e0-ac27-f0b495cab2eb req-ab771731-a5e9-45ce-ba3f-2a1af0ba4eb4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Processing event network-vif-plugged-6b225150-8014-4488-91e5-7faf65ace151 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.001 187132 DEBUG nova.compute.manager [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Instance event wait completed in 10 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.005 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433111.0052364, f2c66e64-57a7-4e97-8552-80a9d24397f6 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.005 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.007 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.016 187132 INFO nova.virt.libvirt.driver [-] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Instance spawned successfully.#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.017 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.035 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.042 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.046 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.047 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.048 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.048 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.049 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.049 187132 DEBUG nova.virt.libvirt.driver [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.088 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.153 187132 INFO nova.compute.manager [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Took 22.22 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.154 187132 DEBUG nova.compute.manager [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.234 187132 INFO nova.compute.manager [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Took 22.67 seconds to build instance.#033[00m
Dec 11 01:05:11 np0005554845 nova_compute[187128]: 2025-12-11 06:05:11.373 187132 DEBUG oslo_concurrency.lockutils [None req-55ad2dee-9194-4b1a-bc8e-ad86f00100d6 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 22.930s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:12 np0005554845 nova_compute[187128]: 2025-12-11 06:05:12.908 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:13 np0005554845 podman[214158]: 2025-12-11 06:05:13.164225958 +0000 UTC m=+0.088670998 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.694 187132 DEBUG nova.compute.manager [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Received event network-vif-plugged-6b225150-8014-4488-91e5-7faf65ace151 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.695 187132 DEBUG oslo_concurrency.lockutils [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.695 187132 DEBUG oslo_concurrency.lockutils [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.696 187132 DEBUG oslo_concurrency.lockutils [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.696 187132 DEBUG nova.compute.manager [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] No waiting events found dispatching network-vif-plugged-6b225150-8014-4488-91e5-7faf65ace151 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.696 187132 WARNING nova.compute.manager [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Received unexpected event network-vif-plugged-6b225150-8014-4488-91e5-7faf65ace151 for instance with vm_state active and task_state None.#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.697 187132 DEBUG nova.compute.manager [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received event network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.697 187132 DEBUG oslo_concurrency.lockutils [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.697 187132 DEBUG oslo_concurrency.lockutils [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.698 187132 DEBUG oslo_concurrency.lockutils [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.698 187132 DEBUG nova.compute.manager [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Processing event network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.698 187132 DEBUG nova.compute.manager [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received event network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.699 187132 DEBUG oslo_concurrency.lockutils [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.701 187132 DEBUG oslo_concurrency.lockutils [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.701 187132 DEBUG oslo_concurrency.lockutils [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.702 187132 DEBUG nova.compute.manager [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] No waiting events found dispatching network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.702 187132 WARNING nova.compute.manager [req-f198c788-f213-454a-82ae-189fc166e140 req-2cd76976-9447-452c-97d3-ec81ae215a85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received unexpected event network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 for instance with vm_state building and task_state spawning.#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.704 187132 DEBUG nova.compute.manager [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Instance event wait completed in 13 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.709 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.710 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433113.7087035, 4b205ab7-6622-4644-a404-ec948480d1ba => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.710 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.715 187132 INFO nova.virt.libvirt.driver [-] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Instance spawned successfully.#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.716 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.734 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.737 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.750 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.750 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.751 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.752 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.753 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.754 187132 DEBUG nova.virt.libvirt.driver [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.761 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.843 187132 INFO nova.compute.manager [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Took 24.04 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.844 187132 DEBUG nova.compute.manager [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.921 187132 INFO nova.compute.manager [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Took 24.63 seconds to build instance.#033[00m
Dec 11 01:05:13 np0005554845 nova_compute[187128]: 2025-12-11 06:05:13.943 187132 DEBUG oslo_concurrency.lockutils [None req-629bcbee-1122-4205-bdbc-3130c4b74347 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 24.743s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:14 np0005554845 nova_compute[187128]: 2025-12-11 06:05:14.775 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:15 np0005554845 nova_compute[187128]: 2025-12-11 06:05:15.365 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:15 np0005554845 NetworkManager[55529]: <info>  [1765433115.3667] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/29)
Dec 11 01:05:15 np0005554845 NetworkManager[55529]: <info>  [1765433115.3676] device (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 01:05:15 np0005554845 NetworkManager[55529]: <warn>  [1765433115.3678] device (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 01:05:15 np0005554845 NetworkManager[55529]: <info>  [1765433115.3690] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Interface device (/org/freedesktop/NetworkManager/Devices/30)
Dec 11 01:05:15 np0005554845 NetworkManager[55529]: <info>  [1765433115.3695] device (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int)[Open vSwitch Interface]: state change: unmanaged -> unavailable (reason 'managed', managed-type: 'external')
Dec 11 01:05:15 np0005554845 NetworkManager[55529]: <warn>  [1765433115.3696] device (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int)[Open vSwitch Interface]: error setting IPv4 forwarding to '1': No such file or directory
Dec 11 01:05:15 np0005554845 NetworkManager[55529]: <info>  [1765433115.3706] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/31)
Dec 11 01:05:15 np0005554845 NetworkManager[55529]: <info>  [1765433115.3741] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/32)
Dec 11 01:05:15 np0005554845 NetworkManager[55529]: <info>  [1765433115.3749] device (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 11 01:05:15 np0005554845 NetworkManager[55529]: <info>  [1765433115.3754] device (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int)[Open vSwitch Interface]: state change: unavailable -> disconnected (reason 'none', managed-type: 'full')
Dec 11 01:05:15 np0005554845 nova_compute[187128]: 2025-12-11 06:05:15.664 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:15 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:15Z|00037|binding|INFO|Releasing lport c8ffeef2-7a6e-414a-8ca6-6cf7e6bf2700 from this chassis (sb_readonly=0)
Dec 11 01:05:15 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:15Z|00038|binding|INFO|Releasing lport 987ee34a-a0fa-428c-8f69-b38de389e7ea from this chassis (sb_readonly=0)
Dec 11 01:05:15 np0005554845 nova_compute[187128]: 2025-12-11 06:05:15.711 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:17 np0005554845 nova_compute[187128]: 2025-12-11 06:05:17.392 187132 DEBUG nova.compute.manager [req-b9b7e7a6-387a-49e8-bfed-0ae224c564fd req-4f98b3ad-1b90-4b04-bfe2-76f01c3fa7dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Received event network-changed-6b225150-8014-4488-91e5-7faf65ace151 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:05:17 np0005554845 nova_compute[187128]: 2025-12-11 06:05:17.394 187132 DEBUG nova.compute.manager [req-b9b7e7a6-387a-49e8-bfed-0ae224c564fd req-4f98b3ad-1b90-4b04-bfe2-76f01c3fa7dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Refreshing instance network info cache due to event network-changed-6b225150-8014-4488-91e5-7faf65ace151. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:05:17 np0005554845 nova_compute[187128]: 2025-12-11 06:05:17.395 187132 DEBUG oslo_concurrency.lockutils [req-b9b7e7a6-387a-49e8-bfed-0ae224c564fd req-4f98b3ad-1b90-4b04-bfe2-76f01c3fa7dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:05:17 np0005554845 nova_compute[187128]: 2025-12-11 06:05:17.395 187132 DEBUG oslo_concurrency.lockutils [req-b9b7e7a6-387a-49e8-bfed-0ae224c564fd req-4f98b3ad-1b90-4b04-bfe2-76f01c3fa7dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:05:17 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:17.395 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=4, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=3) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:05:17 np0005554845 nova_compute[187128]: 2025-12-11 06:05:17.396 187132 DEBUG nova.network.neutron [req-b9b7e7a6-387a-49e8-bfed-0ae224c564fd req-4f98b3ad-1b90-4b04-bfe2-76f01c3fa7dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Refreshing network info cache for port 6b225150-8014-4488-91e5-7faf65ace151 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:05:17 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:17.398 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:05:17 np0005554845 nova_compute[187128]: 2025-12-11 06:05:17.402 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:17 np0005554845 nova_compute[187128]: 2025-12-11 06:05:17.943 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:18 np0005554845 nova_compute[187128]: 2025-12-11 06:05:18.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:05:18 np0005554845 nova_compute[187128]: 2025-12-11 06:05:18.943 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:18 np0005554845 nova_compute[187128]: 2025-12-11 06:05:18.945 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:18 np0005554845 nova_compute[187128]: 2025-12-11 06:05:18.946 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:18 np0005554845 nova_compute[187128]: 2025-12-11 06:05:18.946 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:05:19 np0005554845 podman[214182]: 2025-12-11 06:05:19.095761173 +0000 UTC m=+0.066840426 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:05:19 np0005554845 podman[214183]: 2025-12-11 06:05:19.109626509 +0000 UTC m=+0.083840597 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.openshift.expose-services=, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., architecture=x86_64, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Dec 11 01:05:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:19.402 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '4'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.667 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.725 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.726 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.747 187132 DEBUG nova.network.neutron [req-b9b7e7a6-387a-49e8-bfed-0ae224c564fd req-4f98b3ad-1b90-4b04-bfe2-76f01c3fa7dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Updated VIF entry in instance network info cache for port 6b225150-8014-4488-91e5-7faf65ace151. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.749 187132 DEBUG nova.network.neutron [req-b9b7e7a6-387a-49e8-bfed-0ae224c564fd req-4f98b3ad-1b90-4b04-bfe2-76f01c3fa7dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Updating instance_info_cache with network_info: [{"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.776 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.785 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.789 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.852 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.853 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.920 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:19 np0005554845 nova_compute[187128]: 2025-12-11 06:05:19.958 187132 DEBUG oslo_concurrency.lockutils [req-b9b7e7a6-387a-49e8-bfed-0ae224c564fd req-4f98b3ad-1b90-4b04-bfe2-76f01c3fa7dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:05:20 np0005554845 nova_compute[187128]: 2025-12-11 06:05:20.200 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:05:20 np0005554845 nova_compute[187128]: 2025-12-11 06:05:20.202 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5502MB free_disk=73.32953262329102GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:05:20 np0005554845 nova_compute[187128]: 2025-12-11 06:05:20.203 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:20 np0005554845 nova_compute[187128]: 2025-12-11 06:05:20.203 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:20 np0005554845 nova_compute[187128]: 2025-12-11 06:05:20.783 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance 4b205ab7-6622-4644-a404-ec948480d1ba actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:05:20 np0005554845 nova_compute[187128]: 2025-12-11 06:05:20.783 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance f2c66e64-57a7-4e97-8552-80a9d24397f6 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:05:20 np0005554845 nova_compute[187128]: 2025-12-11 06:05:20.783 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:05:20 np0005554845 nova_compute[187128]: 2025-12-11 06:05:20.784 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:05:20 np0005554845 nova_compute[187128]: 2025-12-11 06:05:20.836 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.141 187132 ERROR nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [req-529b293c-6b04-4aec-9786-e440aa38c0b3] Failed to update inventory to [{'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}}] for resource provider with UUID eece7817-9d4f-4ebe-96c8-a659f76170f9.  Got 409: {"errors": [{"status": 409, "title": "Conflict", "detail": "There was a conflict when trying to complete your request.\n\n resource provider generation conflict  ", "code": "placement.concurrent_update", "request_id": "req-529b293c-6b04-4aec-9786-e440aa38c0b3"}]}#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.198 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing inventories for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.221 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating ProviderTree inventory for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.221 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 0, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.234 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing aggregate associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.258 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing trait associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, traits: COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.326 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.397 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updated inventory for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with generation 4 in Placement from set_inventory_for_provider using data: {'MEMORY_MB': {'total': 7679, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0, 'reserved': 512}, 'VCPU': {'total': 8, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0, 'reserved': 0}, 'DISK_GB': {'total': 79, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9, 'reserved': 1}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:957#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.398 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 generation from 4 to 5 during operation: update_inventory _update_generation /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:164#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.398 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.423 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:05:21 np0005554845 nova_compute[187128]: 2025-12-11 06:05:21.424 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 1.221s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:21 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:21Z|00039|memory|INFO|peak resident set size grew 64% in last 928.4 seconds, from 16128 kB to 26496 kB
Dec 11 01:05:21 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:21Z|00040|memory|INFO|idl-cells-OVN_Southbound:13761 idl-cells-Open_vSwitch:927 if_status_mgr_ifaces_state_usage-KB:1 if_status_mgr_ifaces_usage-KB:1 lflow-cache-entries-cache-expr:500 lflow-cache-entries-cache-matches:331 lflow-cache-size-KB:2106 local_datapath_usage-KB:4 ofctrl_desired_flow_usage-KB:867 ofctrl_installed_flow_usage-KB:635 ofctrl_sb_flow_ref_usage-KB:321
Dec 11 01:05:22 np0005554845 nova_compute[187128]: 2025-12-11 06:05:22.947 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:22 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:22Z|00004|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:8c:57:fd 10.100.0.5
Dec 11 01:05:22 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:22Z|00005|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:8c:57:fd 10.100.0.5
Dec 11 01:05:23 np0005554845 nova_compute[187128]: 2025-12-11 06:05:23.419 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:05:23 np0005554845 nova_compute[187128]: 2025-12-11 06:05:23.419 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:05:23 np0005554845 nova_compute[187128]: 2025-12-11 06:05:23.419 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:05:23 np0005554845 nova_compute[187128]: 2025-12-11 06:05:23.419 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:05:23 np0005554845 nova_compute[187128]: 2025-12-11 06:05:23.603 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:05:23 np0005554845 nova_compute[187128]: 2025-12-11 06:05:23.603 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:05:23 np0005554845 nova_compute[187128]: 2025-12-11 06:05:23.603 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 01:05:23 np0005554845 nova_compute[187128]: 2025-12-11 06:05:23.604 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid f2c66e64-57a7-4e97-8552-80a9d24397f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:05:24 np0005554845 nova_compute[187128]: 2025-12-11 06:05:24.810 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:25 np0005554845 nova_compute[187128]: 2025-12-11 06:05:25.745 187132 DEBUG nova.compute.manager [req-727c36be-5ee0-48bd-901e-8cd67e07c467 req-961b482f-5714-4720-9d5e-9754956bf5e2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received event network-changed-f7e89a08-ebcf-4928-85e4-e649df5a3196 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:05:25 np0005554845 nova_compute[187128]: 2025-12-11 06:05:25.745 187132 DEBUG nova.compute.manager [req-727c36be-5ee0-48bd-901e-8cd67e07c467 req-961b482f-5714-4720-9d5e-9754956bf5e2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Refreshing instance network info cache due to event network-changed-f7e89a08-ebcf-4928-85e4-e649df5a3196. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:05:25 np0005554845 nova_compute[187128]: 2025-12-11 06:05:25.745 187132 DEBUG oslo_concurrency.lockutils [req-727c36be-5ee0-48bd-901e-8cd67e07c467 req-961b482f-5714-4720-9d5e-9754956bf5e2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:05:25 np0005554845 nova_compute[187128]: 2025-12-11 06:05:25.745 187132 DEBUG oslo_concurrency.lockutils [req-727c36be-5ee0-48bd-901e-8cd67e07c467 req-961b482f-5714-4720-9d5e-9754956bf5e2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:05:25 np0005554845 nova_compute[187128]: 2025-12-11 06:05:25.746 187132 DEBUG nova.network.neutron [req-727c36be-5ee0-48bd-901e-8cd67e07c467 req-961b482f-5714-4720-9d5e-9754956bf5e2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Refreshing network info cache for port f7e89a08-ebcf-4928-85e4-e649df5a3196 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:05:26 np0005554845 nova_compute[187128]: 2025-12-11 06:05:26.140 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Updating instance_info_cache with network_info: [{"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:05:26 np0005554845 nova_compute[187128]: 2025-12-11 06:05:26.154 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:05:26 np0005554845 nova_compute[187128]: 2025-12-11 06:05:26.154 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:05:26 np0005554845 nova_compute[187128]: 2025-12-11 06:05:26.154 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:05:26 np0005554845 nova_compute[187128]: 2025-12-11 06:05:26.155 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:05:26 np0005554845 nova_compute[187128]: 2025-12-11 06:05:26.155 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:05:26 np0005554845 nova_compute[187128]: 2025-12-11 06:05:26.155 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:05:26 np0005554845 nova_compute[187128]: 2025-12-11 06:05:26.155 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:05:26 np0005554845 nova_compute[187128]: 2025-12-11 06:05:26.155 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:05:26 np0005554845 nova_compute[187128]: 2025-12-11 06:05:26.155 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:05:26 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:26Z|00006|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5f:c0:6b 10.100.0.10
Dec 11 01:05:26 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:26Z|00007|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5f:c0:6b 10.100.0.10
Dec 11 01:05:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:26.218 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:26.219 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:26.219 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:27 np0005554845 nova_compute[187128]: 2025-12-11 06:05:27.296 187132 DEBUG nova.network.neutron [req-727c36be-5ee0-48bd-901e-8cd67e07c467 req-961b482f-5714-4720-9d5e-9754956bf5e2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Updated VIF entry in instance network info cache for port f7e89a08-ebcf-4928-85e4-e649df5a3196. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:05:27 np0005554845 nova_compute[187128]: 2025-12-11 06:05:27.297 187132 DEBUG nova.network.neutron [req-727c36be-5ee0-48bd-901e-8cd67e07c467 req-961b482f-5714-4720-9d5e-9754956bf5e2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Updating instance_info_cache with network_info: [{"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:05:27 np0005554845 nova_compute[187128]: 2025-12-11 06:05:27.324 187132 DEBUG oslo_concurrency.lockutils [req-727c36be-5ee0-48bd-901e-8cd67e07c467 req-961b482f-5714-4720-9d5e-9754956bf5e2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:05:27 np0005554845 nova_compute[187128]: 2025-12-11 06:05:27.950 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:29 np0005554845 nova_compute[187128]: 2025-12-11 06:05:29.210 187132 INFO nova.compute.manager [None req-c0479412-2319-494f-81aa-a85063d4e959 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Get console output#033[00m
Dec 11 01:05:29 np0005554845 nova_compute[187128]: 2025-12-11 06:05:29.357 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:05:29 np0005554845 nova_compute[187128]: 2025-12-11 06:05:29.812 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:30.474 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}7161bc1119cf5a517610464cbb38d804df069fe8691b5aa1494675c31852b41b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 11 01:05:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:30.967 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 644 Content-Type: application/json Date: Thu, 11 Dec 2025 06:05:30 GMT Keep-Alive: timeout=5, max=100 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-1ef16ff5-2f1a-4399-825c-f8b9a8c3bd53 x-openstack-request-id: req-1ef16ff5-2f1a-4399-825c-f8b9a8c3bd53 _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 11 01:05:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:30.968 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavors": [{"id": "604ddafe-0c56-4202-93c6-01236db9ae98", "name": "m1.nano", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/604ddafe-0c56-4202-93c6-01236db9ae98"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/604ddafe-0c56-4202-93c6-01236db9ae98"}]}, {"id": "94d6aa62-424e-47aa-9ff5-97c0a3c9af5e", "name": "m1.micro", "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/94d6aa62-424e-47aa-9ff5-97c0a3c9af5e"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/94d6aa62-424e-47aa-9ff5-97c0a3c9af5e"}]}]} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 11 01:05:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:30.968 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors?is_public=None used request id req-1ef16ff5-2f1a-4399-825c-f8b9a8c3bd53 request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 11 01:05:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:30.971 12 DEBUG novaclient.v2.client [-] REQ: curl -g -i -X GET https://nova-internal.openstack.svc:8774/v2.1/flavors/604ddafe-0c56-4202-93c6-01236db9ae98 -H "Accept: application/json" -H "User-Agent: python-novaclient" -H "X-Auth-Token: {SHA256}7161bc1119cf5a517610464cbb38d804df069fe8691b5aa1494675c31852b41b" -H "X-OpenStack-Nova-API-Version: 2.1" _http_log_request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:519
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.127 12 DEBUG novaclient.v2.client [-] RESP: [200] Connection: Keep-Alive Content-Length: 495 Content-Type: application/json Date: Thu, 11 Dec 2025 06:05:30 GMT Keep-Alive: timeout=5, max=99 OpenStack-API-Version: compute 2.1 Server: Apache Vary: OpenStack-API-Version,X-OpenStack-Nova-API-Version X-OpenStack-Nova-API-Version: 2.1 x-compute-request-id: req-ae04574c-44ec-4aca-94f9-8f3c3e7a062d x-openstack-request-id: req-ae04574c-44ec-4aca-94f9-8f3c3e7a062d _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:550
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.127 12 DEBUG novaclient.v2.client [-] RESP BODY: {"flavor": {"id": "604ddafe-0c56-4202-93c6-01236db9ae98", "name": "m1.nano", "ram": 128, "disk": 1, "swap": "", "OS-FLV-EXT-DATA:ephemeral": 0, "OS-FLV-DISABLED:disabled": false, "vcpus": 1, "os-flavor-access:is_public": true, "rxtx_factor": 1.0, "links": [{"rel": "self", "href": "https://nova-internal.openstack.svc:8774/v2.1/flavors/604ddafe-0c56-4202-93c6-01236db9ae98"}, {"rel": "bookmark", "href": "https://nova-internal.openstack.svc:8774/flavors/604ddafe-0c56-4202-93c6-01236db9ae98"}]}} _http_log_response /usr/lib/python3.9/site-packages/keystoneauth1/session.py:582
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.128 12 DEBUG novaclient.v2.client [-] GET call to compute for https://nova-internal.openstack.svc:8774/v2.1/flavors/604ddafe-0c56-4202-93c6-01236db9ae98 used request id req-ae04574c-44ec-4aca-94f9-8f3c3e7a062d request /usr/lib/python3.9/site-packages/keystoneauth1/session.py:954
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.129 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'name': 'tempest-TestNetworkBasicOps-server-1021466613', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000001', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': 'fce35ab888e44e46b3108813dcdf4163', 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'hostId': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.135 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000002', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'hostId': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.136 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.141 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for f2c66e64-57a7-4e97-8552-80a9d24397f6 / tap6b225150-80 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.141 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.147 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4b205ab7-6622-4644-a404-ec948480d1ba / tapf7e89a08-eb inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.147 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '422d2e0d-d246-4ff0-9b84-d7435174a793', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'instance-00000001-f2c66e64-57a7-4e97-8552-80a9d24397f6-tap6b225150-80', 'timestamp': '2025-12-11T06:05:32.136645', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'tap6b225150-80', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8c:57:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b225150-80'}, 'message_id': '66aa63e8-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.838565207, 'message_signature': '125caca6640fcc528234da99c716dc3ee294cf4a2912be423d540b8559dc256a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000002-4b205ab7-6622-4644-a404-ec948480d1ba-tapf7e89a08-eb', 'timestamp': '2025-12-11T06:05:32.136645', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'tapf7e89a08-eb', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:c0:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7e89a08-eb'}, 'message_id': '66ab321e-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.844830987, 'message_signature': 'fec25b20d466d1f7c41c9ffffe288ef349ec6b5227e2c9adf8550b740c669d5d'}]}, 'timestamp': '2025-12-11 06:05:32.148284', '_unique_id': '33a612d235c04194b68d5777709159d5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.157 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.163 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.163 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.164 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ce85b136-2a0b-4288-b19b-ab842a64ea4d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'instance-00000001-f2c66e64-57a7-4e97-8552-80a9d24397f6-tap6b225150-80', 'timestamp': '2025-12-11T06:05:32.163561', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'tap6b225150-80', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8c:57:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b225150-80'}, 'message_id': '66adab98-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.838565207, 'message_signature': 'd2ceb6319cd4d8f0f9b991330fc4919efa00b8de170903985a4c12636b6f8627'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000002-4b205ab7-6622-4644-a404-ec948480d1ba-tapf7e89a08-eb', 'timestamp': '2025-12-11T06:05:32.163561', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'tapf7e89a08-eb', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:c0:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7e89a08-eb'}, 'message_id': '66adcf42-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.844830987, 'message_signature': 'a73344fade9ef3b35c06b9e99a5cdabfb7351d9b02881525c49294aadad1fcd2'}]}, 'timestamp': '2025-12-11 06:05:32.165558', '_unique_id': '760a53b44e724c3e997cb80a733d76a8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.167 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.169 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.170 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/network.outgoing.packets volume: 106 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.170 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/network.outgoing.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14bc707b-0de8-4bfb-bf38-b53682d8ce8d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 106, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'instance-00000001-f2c66e64-57a7-4e97-8552-80a9d24397f6-tap6b225150-80', 'timestamp': '2025-12-11T06:05:32.170122', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'tap6b225150-80', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8c:57:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b225150-80'}, 'message_id': '66aea8d6-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.838565207, 'message_signature': '46d08aec41c288749c9c5aa861927e852be2a83f4e633a67ddab74593ca2690a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000002-4b205ab7-6622-4644-a404-ec948480d1ba-tapf7e89a08-eb', 'timestamp': '2025-12-11T06:05:32.170122', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'tapf7e89a08-eb', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:c0:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7e89a08-eb'}, 'message_id': '66aec9c4-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.844830987, 'message_signature': 'c533d7e64d03b10ef69b798a15c0655928726ac6e63efafe6a2ffce28fdc380e'}]}, 'timestamp': '2025-12-11 06:05:32.171921', '_unique_id': '73472ba949284e4eb1cfe93c7a89dbc9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.173 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.175 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.175 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/network.outgoing.bytes volume: 15816 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.175 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/network.outgoing.bytes volume: 1438 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c7573581-dc82-49ed-adde-9f85e723aac4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 15816, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'instance-00000001-f2c66e64-57a7-4e97-8552-80a9d24397f6-tap6b225150-80', 'timestamp': '2025-12-11T06:05:32.175231', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'tap6b225150-80', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8c:57:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b225150-80'}, 'message_id': '66af68b6-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.838565207, 'message_signature': 'beca5b4a1b101d878ac08b387cbb38e760fefcbfa553b959ec97cb1b40705e2d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1438, 'user_id': 
'40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000002-4b205ab7-6622-4644-a404-ec948480d1ba-tapf7e89a08-eb', 'timestamp': '2025-12-11T06:05:32.175231', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'tapf7e89a08-eb', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:c0:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7e89a08-eb'}, 'message_id': '66af7d74-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.844830987, 'message_signature': '3d7c516c3c6db7857f9f4fe0026beb12d03a0ea04105c507ffb999f6b494fd57'}]}, 'timestamp': '2025-12-11 06:05:32.176460', '_unique_id': '84d53bb65de7422792b84bf332e1ee3f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.177 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.179 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.197 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.198 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.210 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.211 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8742e1ea-a761-48bd-8b63-b4ab33f2f3b7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-vda', 'timestamp': '2025-12-11T06:05:32.179182', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66b2ca7e-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.881282007, 'message_signature': 'b1cb7044fbb620eae5e2764e079ed1589006552c75c144973cbd3fdc22c7ecd2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 
'f2c66e64-57a7-4e97-8552-80a9d24397f6-sda', 'timestamp': '2025-12-11T06:05:32.179182', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66b2df00-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.881282007, 'message_signature': '2decd681f81228aeaea23ba76817cd9cfb0b3214260c93ea2279bcebaff4b43e'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-vda', 'timestamp': '2025-12-11T06:05:32.179182', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66b4c52c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.900138398, 'message_signature': '86f425ad901685f8525f921ba479e304401a02786d3814b657368d60379cdf44'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-sda', 'timestamp': '2025-12-11T06:05:32.179182', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66b4d774-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.900138398, 'message_signature': '40fc0d0a3eda4ca017044f2ed93f8811d01247cc845876e349eda8427c4e7eb9'}]}, 'timestamp': '2025-12-11 06:05:32.211401', '_unique_id': '9bf8b2ced3614fc99163a3d919c271df'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.214 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.214 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.214 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1021466613>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1446310862>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1021466613>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1446310862>]
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.215 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.216 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c5c0bce0-78a9-4d56-8f82-33d3213eb7e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'instance-00000001-f2c66e64-57a7-4e97-8552-80a9d24397f6-tap6b225150-80', 'timestamp': '2025-12-11T06:05:32.215638', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'tap6b225150-80', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8c:57:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b225150-80'}, 'message_id': '66b58ebc-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.838565207, 'message_signature': 'de306012321bdca1598f5eb7b83e57641382321ea863132b91ccc4698399b407'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000002-4b205ab7-6622-4644-a404-ec948480d1ba-tapf7e89a08-eb', 'timestamp': '2025-12-11T06:05:32.215638', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'tapf7e89a08-eb', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:c0:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7e89a08-eb'}, 'message_id': '66b59d30-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.844830987, 'message_signature': '42904c4740c12262b0c81182da00029bbac333a68bc1b66e3240a3076146e263'}]}, 'timestamp': '2025-12-11 06:05:32.216444', '_unique_id': '69a4db65fe22488fb96aff3aab39ff39'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.217 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.218 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.250 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.write.requests volume: 302 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.252 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.282 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.write.requests volume: 316 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.283 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '87c47604-1302-447d-8a65-a2c90346e7d8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 302, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-vda', 'timestamp': '2025-12-11T06:05:32.218434', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66bb040a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': '61d8b38e4b68326bfbdb3398664906f8415a9e24f018d96477780516796a3d14'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': 
None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-sda', 'timestamp': '2025-12-11T06:05:32.218434', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66bb18e6-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': '040271a6794e6a87f3b709e56e51d9de5117779b64d491c70df0b186ace5d6bf'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 316, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-vda', 'timestamp': '2025-12-11T06:05:32.218434', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66bfc512-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': 'cc20ba80e7bfd6ba3f47d82e1266be70c5d34f4ae8faf819b240f2d383c38fbb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-sda', 'timestamp': '2025-12-11T06:05:32.218434', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66bfd2be-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': 'c3e0ff9a8978f923b99e6d331d8a56e94e432e7944c76ae623357beab0363dff'}]}, 'timestamp': '2025-12-11 06:05:32.283354', '_unique_id': 'faecf87c5bde4710bd24de2258c6fce4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.284 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.285 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.285 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/network.incoming.bytes volume: 18992 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.285 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/network.incoming.bytes volume: 1744 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc517514-00f7-4596-8064-7d73477ec19e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 18992, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'instance-00000001-f2c66e64-57a7-4e97-8552-80a9d24397f6-tap6b225150-80', 'timestamp': '2025-12-11T06:05:32.285225', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'tap6b225150-80', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8c:57:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b225150-80'}, 'message_id': '66c02dfe-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.838565207, 'message_signature': '9efad5c0f755c162da5373b66467e2422100451af08d0f62d6a0ec12b22f61c6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1744, 'user_id': 
'40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000002-4b205ab7-6622-4644-a404-ec948480d1ba-tapf7e89a08-eb', 'timestamp': '2025-12-11T06:05:32.285225', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'tapf7e89a08-eb', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:c0:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7e89a08-eb'}, 'message_id': '66c049b0-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.844830987, 'message_signature': '9f3de204ca90d561fd4444a9c98dbf656b644862e18fde311257beb7f8450e80'}]}, 'timestamp': '2025-12-11 06:05:32.286442', '_unique_id': '08df395b758d4b3d9780f3df09fc648a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.287 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.288 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.288 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.write.latency volume: 3479492014 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.289 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.289 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.write.latency volume: 6309024288 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.289 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c1a284b-576c-4731-836a-c44385d8b6c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 3479492014, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-vda', 'timestamp': '2025-12-11T06:05:32.288733', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c0b468-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': 'ac60fffe766bf75e3cc0acea44d9f52db68814c68fd06e64429b0a576a59212a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 
'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-sda', 'timestamp': '2025-12-11T06:05:32.288733', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c0c0de-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': 'd7492bb512a23f84459fc77b7ab4ba074fdac0e4f704afeb13cfce9926fb3d7f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6309024288, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-vda', 'timestamp': '2025-12-11T06:05:32.288733', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c0cce6-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': '4fcad2a84de68b03d5f2d9df76b219ebbeed621a7ec21987d9f02d5624f8f7b8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-sda', 'timestamp': '2025-12-11T06:05:32.288733', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c0daf6-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': 'd9419a04e751abd722bb603039368668ef1b78ddd8cdeef6bd909bcfa9fbc091'}]}, 'timestamp': '2025-12-11 06:05:32.290068', '_unique_id': 'f52cb2bb8fe0486889eed202dc689c18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.291 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.307 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/memory.usage volume: 42.94140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.320 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/memory.usage volume: 40.37890625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e1122646-9333-4535-a3be-ac59097ee57b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.94140625, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'timestamp': '2025-12-11T06:05:32.291972', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '66c38a3a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3407.008607293, 'message_signature': '21c11edd13d1053638e00fefae412a9d704c9b3b60abea7e93e24c8b4891ad3d'}, {'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.37890625, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 
'4b205ab7-6622-4644-a404-ec948480d1ba', 'timestamp': '2025-12-11T06:05:32.291972', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '66c57f70-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3407.021682568, 'message_signature': '3395f9179dd2d6d4af6411f6653dae8970b7ccd216b64d0e33c1e6e1cc920fba'}]}, 'timestamp': '2025-12-11 06:05:32.320510', '_unique_id': 'e207ac99bb6542c78f960331002d3800'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.321 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.322 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.322 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.322 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88efe239-d558-4af0-8305-8e7fc2aebd88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'instance-00000001-f2c66e64-57a7-4e97-8552-80a9d24397f6-tap6b225150-80', 'timestamp': '2025-12-11T06:05:32.322109', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'tap6b225150-80', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8c:57:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b225150-80'}, 'message_id': '66c5ca98-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.838565207, 'message_signature': '1543f541f64b7c2cd4d7f1a9c3f2656954732a39048b47b935b471e12a94d000'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000002-4b205ab7-6622-4644-a404-ec948480d1ba-tapf7e89a08-eb', 'timestamp': '2025-12-11T06:05:32.322109', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'tapf7e89a08-eb', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:c0:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7e89a08-eb'}, 'message_id': '66c5d7fe-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.844830987, 'message_signature': '0bb0fa852cc3a6e0731afde75513ffb1704e23510c5516035e729ff2f7109d2c'}]}, 'timestamp': '2025-12-11 06:05:32.322721', '_unique_id': 'cf5020c3a8b24d7cb2ad71d0acf23095'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.323 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.324 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.324 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1021466613>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1446310862>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1021466613>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1446310862>]
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.324 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.324 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.read.latency volume: 211934395 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.324 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.read.latency volume: 27151446 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.325 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.read.latency volume: 169311242 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.325 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.read.latency volume: 26880589 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2b0051eb-acf1-4c1f-bc99-6e3e3dff44bf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 211934395, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-vda', 'timestamp': '2025-12-11T06:05:32.324439', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c6265a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': '3352a2598fc8a62c2cef983d763f60a8c0b2aac4c4a8a272e300ae485158d351'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27151446, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': 
None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-sda', 'timestamp': '2025-12-11T06:05:32.324439', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c63136-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': 'df7acf1fc830e270db6f06ec0a81224ae48291d843e14057a0606a6f37df226f'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 169311242, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-vda', 'timestamp': '2025-12-11T06:05:32.324439', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c63b22-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': '60317df8e4d10e2f126be7cfac41b550646d7a298c1ea3cd38134f1fe2451ac8'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 26880589, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-sda', 'timestamp': '2025-12-11T06:05:32.324439', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c64798-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': '0a851070c63df7ccc9e95b5eeb819f50b565825469ade391484040551ed35ef8'}]}, 'timestamp': '2025-12-11 06:05:32.325619', '_unique_id': '779cd43dec014ee09cd66ca1dd7f0b98'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.326 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.327 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.327 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.327 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ff744e35-3cce-4d66-b110-c775247419ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'instance-00000001-f2c66e64-57a7-4e97-8552-80a9d24397f6-tap6b225150-80', 'timestamp': '2025-12-11T06:05:32.327125', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'tap6b225150-80', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8c:57:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b225150-80'}, 'message_id': '66c68da2-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.838565207, 'message_signature': '6b4f865e34ff6bea2e0fc5e76e8ba5936d2f8f6a1d95c0ffecb6e19bf6dd32c3'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000002-4b205ab7-6622-4644-a404-ec948480d1ba-tapf7e89a08-eb', 'timestamp': '2025-12-11T06:05:32.327125', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'tapf7e89a08-eb', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:c0:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7e89a08-eb'}, 'message_id': '66c69b8a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.844830987, 'message_signature': '932b2790005cf33ac615ffe4c436a0470a4506f60f171176a69ea0762bd266fa'}]}, 'timestamp': '2025-12-11 06:05:32.327758', '_unique_id': '0aaa7a54e71943448c47edd7248f8798'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.328 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.329 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.329 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.allocation volume: 30023680 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.329 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.329 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.330 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4a2b5ef-897f-4e82-8a36-aa72f954ff67', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30023680, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-vda', 'timestamp': '2025-12-11T06:05:32.329238', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c6e14e-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.881282007, 'message_signature': '7ec36f12d2836ea50cb2949999010719fa1c0e463f4b04c19f59470dcd720de0'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 
'f2c66e64-57a7-4e97-8552-80a9d24397f6-sda', 'timestamp': '2025-12-11T06:05:32.329238', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c6f080-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.881282007, 'message_signature': 'a098f420e6b64c624c8806fa4d8307798301655bd040df587ec625ef2f02cd68'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-vda', 'timestamp': '2025-12-11T06:05:32.329238', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c6fb7a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.900138398, 'message_signature': '7972b664bc1c697811b4a6306953047aec1e5ae4574ce79b55a98dda8c995f39'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-sda', 'timestamp': '2025-12-11T06:05:32.329238', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c7053e-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.900138398, 'message_signature': '5794094cd07b75d39187306d62ae4f7587d51e76c95b26ea47b9096c62f7d839'}]}, 'timestamp': '2025-12-11 06:05:32.330471', '_unique_id': '575167af16cb4287b75c22d1afd9ce52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.331 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.332 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.332 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd5a70082-b1c5-404a-98fe-fd69255e96ca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'instance-00000001-f2c66e64-57a7-4e97-8552-80a9d24397f6-tap6b225150-80', 'timestamp': '2025-12-11T06:05:32.332056', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'tap6b225150-80', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8c:57:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b225150-80'}, 'message_id': '66c74eb8-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.838565207, 'message_signature': '898da6bc768581bd44f8e123507d447a730acbc1b39fdf57e4c67610d5be1b53'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000002-4b205ab7-6622-4644-a404-ec948480d1ba-tapf7e89a08-eb', 'timestamp': '2025-12-11T06:05:32.332056', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'tapf7e89a08-eb', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:c0:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7e89a08-eb'}, 'message_id': '66c75a5c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.844830987, 'message_signature': '615a2cff4529b2456f3db647e1247a269eb9357a2c7456726390406eb3a7838c'}]}, 'timestamp': '2025-12-11 06:05:32.332684', '_unique_id': '2ba68b703a8d438b90482b46654d25d6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.333 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.334 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.334 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.334 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1021466613>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1446310862>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1021466613>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1446310862>]
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.334 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.334 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/network.incoming.packets volume: 102 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.334 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9414a41-bbfc-41f9-ba29-3e395026e31a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 102, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'instance-00000001-f2c66e64-57a7-4e97-8552-80a9d24397f6-tap6b225150-80', 'timestamp': '2025-12-11T06:05:32.334586', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'tap6b225150-80', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:8c:57:fd', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap6b225150-80'}, 'message_id': '66c7b38a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.838565207, 'message_signature': 'a327e3faaebb9dcaf47d72d0554e7b71d2fa905ed50be3ae215a31983b23c850'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 
'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000002-4b205ab7-6622-4644-a404-ec948480d1ba-tapf7e89a08-eb', 'timestamp': '2025-12-11T06:05:32.334586', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'tapf7e89a08-eb', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:5f:c0:6b', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf7e89a08-eb'}, 'message_id': '66c7be84-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.844830987, 'message_signature': 'f9ccd882691c8950602752dad606a2bc46f7fa59b58a52ebb19ab83990840f49'}]}, 'timestamp': '2025-12-11 06:05:32.335202', '_unique_id': '4095d07e89a54e209bdb919110a6d436'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.335 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.336 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.336 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.read.requests volume: 1109 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.336 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.337 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.read.requests volume: 1091 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.337 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b3f8eac-ed59-4661-9a17-961034d788db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1109, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-vda', 'timestamp': '2025-12-11T06:05:32.336667', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c80470-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': '8ca627a7daa856ae049b2281a1426bed48debc2cdd0bb3800572675e9e4c7060'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': 
None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-sda', 'timestamp': '2025-12-11T06:05:32.336667', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c80ec0-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': '818b482266020e3fa1699e8e6cf4fbff41f3806e8adebb1f02b451ea4319e389'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1091, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-vda', 'timestamp': '2025-12-11T06:05:32.336667', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': 
{'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c8185c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': '5522c6d7daee9a40fedcf55ad66884f44ee97110d38bc504ffbc938fffec5981'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-sda', 'timestamp': '2025-12-11T06:05:32.336667', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c8236a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': 'ee44c574405959f85650d07c8a3d58a707fa0cd9b83e8665b8814c6783c27437'}]}, 'timestamp': '2025-12-11 06:05:32.337775', '_unique_id': 'f7c18cafc24548988935d6a95acb460a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.338 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.339 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.339 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.read.bytes volume: 30591488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.339 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.339 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.read.bytes volume: 30181888 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.340 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '86829b9f-6ac0-4b00-ae9a-1a320e6c4878', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30591488, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-vda', 'timestamp': '2025-12-11T06:05:32.339337', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c86bc2-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': '0a9b365a23d847a6bb680b904d310ea02fd53c803e15b21fcdd97d7e3e8e7d6d'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 
'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-sda', 'timestamp': '2025-12-11T06:05:32.339337', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c875ea-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': '9e3184f14ce12e51519b45a6b470d05f886758fbdfdb40c10abd7b8ccd74ea92'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30181888, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-vda', 'timestamp': '2025-12-11T06:05:32.339337', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c88288-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': '8328e87c9c2e2ed1fcbcd42dd2a571cc9f3030672e12071c276b6d32707ac336'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-sda', 'timestamp': '2025-12-11T06:05:32.339337', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c88d28-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': 'd6bc5c803875ec450291c92d44dbcf27ea3f20cc00eebd541f22878c1ba79f64'}]}, 'timestamp': '2025-12-11 06:05:32.340523', '_unique_id': 'a58dd0e629874db68b185ac340fc4f9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.341 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.342 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.342 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.342 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1021466613>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1446310862>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkBasicOps-server-1021466613>, <NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1446310862>]
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.342 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.342 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/cpu volume: 11470000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.342 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/cpu volume: 11110000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '99d12f46-de63-405e-9d80-d085b944559c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11470000000, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'timestamp': '2025-12-11T06:05:32.342498', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '66c8e660-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3407.008607293, 'message_signature': '72db87596a9e2cc420f8d3c34cd8ceda0aabea06caa65ace979853fe8a99e72e'}, {'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11110000000, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 
'4b205ab7-6622-4644-a404-ec948480d1ba', 'timestamp': '2025-12-11T06:05:32.342498', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '66c8f088-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3407.021682568, 'message_signature': '022144aaed40a1b09af3583838972ee982b455ad54747bcd05a121c349522555'}]}, 'timestamp': '2025-12-11 06:05:32.343081', '_unique_id': '0fac51d1a6db4680940d9726fb21f694'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.343 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.344 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.344 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.344 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.344 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e82418c-9edb-4057-a6ee-e1ce82d7eed5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-vda', 'timestamp': '2025-12-11T06:05:32.344503', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c934a8-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.881282007, 'message_signature': '264e703994262044d92a4484ff8adde67d4f9a389439f442b9250ba140dab536'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 
'f2c66e64-57a7-4e97-8552-80a9d24397f6-sda', 'timestamp': '2025-12-11T06:05:32.344503', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c93d22-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.881282007, 'message_signature': 'f4b2996f23621905ba09a425285f6441c5626ede0630caa63e25373cfb860c69'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-vda', 'timestamp': '2025-12-11T06:05:32.344503', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c946a0-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.900138398, 'message_signature': 'd2d53f925dd4f1948896f60d8ddbe9d0b97bd81b5708cf907565e043f3cb71fa'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-sda', 'timestamp': '2025-12-11T06:05:32.344503', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c94df8-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.900138398, 'message_signature': '932befb3f508077aa34b9458de0381064bb43061dc34999c13aec9f0281b60a2'}]}, 'timestamp': '2025-12-11 06:05:32.345401', '_unique_id': 'b918f8b339ce4555a8891510c55acf1a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.345 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.346 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.346 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.346 12 DEBUG ceilometer.compute.pollsters [-] f2c66e64-57a7-4e97-8552-80a9d24397f6/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.346 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.write.bytes volume: 72892416 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 DEBUG ceilometer.compute.pollsters [-] 4b205ab7-6622-4644-a404-ec948480d1ba/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '124fd873-156a-469b-b053-cf5029ec2acc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-vda', 'timestamp': '2025-12-11T06:05:32.346558', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c98412-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': '3a25ea856b9773c84b26c104c7ff47d6f3d25afb746f0f03fdb5c3b46a417ffb'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_name': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_name': None, 
'resource_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6-sda', 'timestamp': '2025-12-11T06:05:32.346558', 'resource_metadata': {'display_name': 'tempest-TestNetworkBasicOps-server-1021466613', 'name': 'instance-00000001', 'instance_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'instance_type': 'm1.nano', 'host': '0cae10d6b9378c4e5f442ae2538c0b5dca5290d03f8d29e1339c0eb9', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c98bf6-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.920327327, 'message_signature': '0322fa43af2dff3eb61c3a3e4d06fe54f478a99d703a6ae464c9b0bd7c3a54df'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72892416, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-vda', 'timestamp': '2025-12-11T06:05:32.346558', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': 
'8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '66c9965a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': 'f185bcc095ef555620d1ed047de6ced20562e620792603fa7601376cb529e4c8'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '4b205ab7-6622-4644-a404-ec948480d1ba-sda', 'timestamp': '2025-12-11T06:05:32.346558', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1446310862', 'name': 'instance-00000002', 'instance_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '66c99fd8-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3406.954092053, 'message_signature': '12ef4542ad38b4c4cdc75f80d0a44a8089f6ab36f92909a964186d02d94d000c'}]}, 'timestamp': '2025-12-11 06:05:32.347488', '_unique_id': '7d54256b420340c3b965b2d009ddeee6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:05:32 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:05:32.347 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:05:32 np0005554845 nova_compute[187128]: 2025-12-11 06:05:32.954 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:33 np0005554845 podman[214270]: 2025-12-11 06:05:33.132219482 +0000 UTC m=+0.051373364 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:05:34 np0005554845 nova_compute[187128]: 2025-12-11 06:05:34.350 187132 INFO nova.compute.manager [None req-cf25f03c-7f8f-4ffc-b8c5-c3ca333a118b 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Get console output#033[00m
Dec 11 01:05:34 np0005554845 nova_compute[187128]: 2025-12-11 06:05:34.357 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:05:34 np0005554845 nova_compute[187128]: 2025-12-11 06:05:34.816 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:37 np0005554845 nova_compute[187128]: 2025-12-11 06:05:37.154 187132 INFO nova.compute.manager [None req-95276a4f-c860-400e-a4fd-1019db7c81cc 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Get console output#033[00m
Dec 11 01:05:37 np0005554845 nova_compute[187128]: 2025-12-11 06:05:37.159 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:05:37 np0005554845 nova_compute[187128]: 2025-12-11 06:05:37.957 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:39 np0005554845 podman[214295]: 2025-12-11 06:05:39.130015928 +0000 UTC m=+0.061055949 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, 
org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute)
Dec 11 01:05:39 np0005554845 nova_compute[187128]: 2025-12-11 06:05:39.820 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:40 np0005554845 nova_compute[187128]: 2025-12-11 06:05:40.645 187132 DEBUG oslo_concurrency.lockutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Acquiring lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:05:40 np0005554845 nova_compute[187128]: 2025-12-11 06:05:40.645 187132 DEBUG oslo_concurrency.lockutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Acquired lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:05:40 np0005554845 nova_compute[187128]: 2025-12-11 06:05:40.645 187132 DEBUG nova.network.neutron [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:05:41 np0005554845 podman[214315]: 2025-12-11 06:05:41.150262018 +0000 UTC m=+0.076913189 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible)
Dec 11 01:05:41 np0005554845 podman[214316]: 2025-12-11 06:05:41.229108228 +0000 UTC m=+0.149478569 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 11 01:05:42 np0005554845 nova_compute[187128]: 2025-12-11 06:05:42.437 187132 DEBUG nova.network.neutron [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Updating instance_info_cache with network_info: [{"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:05:42 np0005554845 nova_compute[187128]: 2025-12-11 06:05:42.521 187132 DEBUG oslo_concurrency.lockutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Releasing lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:05:42 np0005554845 nova_compute[187128]: 2025-12-11 06:05:42.722 187132 DEBUG nova.virt.libvirt.driver [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Dec 11 01:05:42 np0005554845 nova_compute[187128]: 2025-12-11 06:05:42.723 187132 DEBUG nova.virt.libvirt.volume.remotefs [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Creating file /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/9b691ef8c10b431c93ca2e2208fd6164.tmp on remote host 192.168.122.100 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Dec 11 01:05:42 np0005554845 nova_compute[187128]: 2025-12-11 06:05:42.724 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/9b691ef8c10b431c93ca2e2208fd6164.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:43 np0005554845 nova_compute[187128]: 2025-12-11 06:05:43.067 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:43 np0005554845 nova_compute[187128]: 2025-12-11 06:05:43.181 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/9b691ef8c10b431c93ca2e2208fd6164.tmp" returned: 1 in 0.457s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:43 np0005554845 nova_compute[187128]: 2025-12-11 06:05:43.182 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] 'ssh -o BatchMode=yes 192.168.122.100 touch /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/9b691ef8c10b431c93ca2e2208fd6164.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec 11 01:05:43 np0005554845 nova_compute[187128]: 2025-12-11 06:05:43.182 187132 DEBUG nova.virt.libvirt.volume.remotefs [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Creating directory /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba on remote host 192.168.122.100 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Dec 11 01:05:43 np0005554845 nova_compute[187128]: 2025-12-11 06:05:43.182 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:43 np0005554845 nova_compute[187128]: 2025-12-11 06:05:43.411 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.100 mkdir -p /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba" returned: 0 in 0.228s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:43 np0005554845 nova_compute[187128]: 2025-12-11 06:05:43.415 187132 DEBUG nova.virt.libvirt.driver [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec 11 01:05:44 np0005554845 podman[214361]: 2025-12-11 06:05:44.137313002 +0000 UTC m=+0.064876962 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 11 01:05:44 np0005554845 nova_compute[187128]: 2025-12-11 06:05:44.821 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:45 np0005554845 nova_compute[187128]: 2025-12-11 06:05:45.069 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:45 np0005554845 nova_compute[187128]: 2025-12-11 06:05:45.070 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:45 np0005554845 nova_compute[187128]: 2025-12-11 06:05:45.100 187132 DEBUG nova.compute.manager [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:05:45 np0005554845 nova_compute[187128]: 2025-12-11 06:05:45.199 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:45 np0005554845 nova_compute[187128]: 2025-12-11 06:05:45.200 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:45 np0005554845 nova_compute[187128]: 2025-12-11 06:05:45.209 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:05:45 np0005554845 nova_compute[187128]: 2025-12-11 06:05:45.210 187132 INFO nova.compute.claims [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:05:45 np0005554845 kernel: tapf7e89a08-eb (unregistering): left promiscuous mode
Dec 11 01:05:45 np0005554845 NetworkManager[55529]: <info>  [1765433145.7175] device (tapf7e89a08-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:05:45 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:45Z|00041|binding|INFO|Releasing lport f7e89a08-ebcf-4928-85e4-e649df5a3196 from this chassis (sb_readonly=0)
Dec 11 01:05:45 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:45Z|00042|binding|INFO|Setting lport f7e89a08-ebcf-4928-85e4-e649df5a3196 down in Southbound
Dec 11 01:05:45 np0005554845 nova_compute[187128]: 2025-12-11 06:05:45.727 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:45 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:45Z|00043|binding|INFO|Removing iface tapf7e89a08-eb ovn-installed in OVS
Dec 11 01:05:45 np0005554845 nova_compute[187128]: 2025-12-11 06:05:45.732 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:45 np0005554845 nova_compute[187128]: 2025-12-11 06:05:45.743 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:45 np0005554845 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Deactivated successfully.
Dec 11 01:05:45 np0005554845 systemd[1]: machine-qemu\x2d2\x2dinstance\x2d00000002.scope: Consumed 13.394s CPU time.
Dec 11 01:05:45 np0005554845 systemd-machined[153381]: Machine qemu-2-instance-00000002 terminated.
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.433 187132 INFO nova.virt.libvirt.driver [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Instance shutdown successfully after 3 seconds.#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.440 187132 INFO nova.virt.libvirt.driver [-] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Instance destroyed successfully.#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.442 187132 DEBUG nova.virt.libvirt.vif [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:04:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1446310862',display_name='tempest-TestNetworkAdvancedServerOps-server-1446310862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1446310862',id=2,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgM/qAsq/W6pl2lUgKsTugKsHWIU0fM/qatNQRqhtCy4/LE7hhfrHzJklvICL0fI2w3nlVpvfyHHrtL8lBcnC0/CqZJ2+IvhPWma1ca7/i1wyykXdHh2hWxolw5MKjldw==',key_name='tempest-TestNetworkAdvancedServerOps-1888270068',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:05:13Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-kodglh7t',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:05:40Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=4b205ab7-6622-4644-a404-ec948480d1ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1124514744", "vif_mac": "fa:16:3e:5f:c0:6b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.442 187132 DEBUG nova.network.os_vif_util [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Converting VIF {"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1124514744", "vif_mac": "fa:16:3e:5f:c0:6b"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.444 187132 DEBUG nova.network.os_vif_util [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:c0:6b,bridge_name='br-int',has_traffic_filtering=True,id=f7e89a08-ebcf-4928-85e4-e649df5a3196,network=Network(af86bfb7-241f-4a6e-8237-9d9593dd5fa4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e89a08-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.444 187132 DEBUG os_vif [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:c0:6b,bridge_name='br-int',has_traffic_filtering=True,id=f7e89a08-ebcf-4928-85e4-e649df5a3196,network=Network(af86bfb7-241f-4a6e-8237-9d9593dd5fa4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e89a08-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.447 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.448 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7e89a08-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.450 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.452 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.456 187132 INFO os_vif [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:c0:6b,bridge_name='br-int',has_traffic_filtering=True,id=f7e89a08-ebcf-4928-85e4-e649df5a3196,network=Network(af86bfb7-241f-4a6e-8237-9d9593dd5fa4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e89a08-eb')#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.461 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.523 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.525 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.577 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk --force-share --output=json" returned: 0 in 0.052s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.580 187132 DEBUG nova.virt.libvirt.volume.remotefs [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Copying file /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba_resize/disk to 192.168.122.100:/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec 11 01:05:46 np0005554845 nova_compute[187128]: 2025-12-11 06:05:46.580 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba_resize/disk 192.168.122.100:/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:47 np0005554845 nova_compute[187128]: 2025-12-11 06:05:47.274 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "scp -r /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba_resize/disk 192.168.122.100:/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk" returned: 0 in 0.693s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:47 np0005554845 nova_compute[187128]: 2025-12-11 06:05:47.276 187132 DEBUG nova.virt.libvirt.volume.remotefs [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Copying file /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba_resize/disk.config to 192.168.122.100:/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec 11 01:05:47 np0005554845 nova_compute[187128]: 2025-12-11 06:05:47.277 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba_resize/disk.config 192.168.122.100:/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:47 np0005554845 nova_compute[187128]: 2025-12-11 06:05:47.536 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "scp -C -r /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba_resize/disk.config 192.168.122.100:/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.config" returned: 0 in 0.260s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:47 np0005554845 nova_compute[187128]: 2025-12-11 06:05:47.538 187132 DEBUG nova.virt.libvirt.volume.remotefs [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Copying file /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba_resize/disk.info to 192.168.122.100:/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec 11 01:05:47 np0005554845 nova_compute[187128]: 2025-12-11 06:05:47.539 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba_resize/disk.info 192.168.122.100:/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:47 np0005554845 nova_compute[187128]: 2025-12-11 06:05:47.763 187132 DEBUG oslo_concurrency.processutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "scp -C -r /var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba_resize/disk.info 192.168.122.100:/var/lib/nova/instances/4b205ab7-6622-4644-a404-ec948480d1ba/disk.info" returned: 0 in 0.223s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:48.870 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5f:c0:6b 10.100.0.10'], port_security=['fa:16:3e:5f:c0:6b 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': '4b205ab7-6622-4644-a404-ec948480d1ba', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-af86bfb7-241f-4a6e-8237-9d9593dd5fa4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'ba02ffb9-9586-46dd-b538-cc6860e13640', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.196'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=eaab7f8a-7d81-44bb-8e94-37cb560116c2, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=f7e89a08-ebcf-4928-85e4-e649df5a3196) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:05:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:48.872 104320 INFO neutron.agent.ovn.metadata.agent [-] Port f7e89a08-ebcf-4928-85e4-e649df5a3196 in datapath af86bfb7-241f-4a6e-8237-9d9593dd5fa4 unbound from our chassis#033[00m
Dec 11 01:05:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:48.875 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network af86bfb7-241f-4a6e-8237-9d9593dd5fa4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:05:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:48.876 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[bf627e55-f061-4099-9aae-5cba9044815e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:48.877 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4 namespace which is not needed anymore#033[00m
Dec 11 01:05:48 np0005554845 nova_compute[187128]: 2025-12-11 06:05:48.979 187132 DEBUG neutronclient.v2_0.client [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port f7e89a08-ebcf-4928-85e4-e649df5a3196 for host compute-0.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Dec 11 01:05:49 np0005554845 neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4[214076]: [NOTICE]   (214080) : haproxy version is 2.8.14-c23fe91
Dec 11 01:05:49 np0005554845 neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4[214076]: [NOTICE]   (214080) : path to executable is /usr/sbin/haproxy
Dec 11 01:05:49 np0005554845 neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4[214076]: [WARNING]  (214080) : Exiting Master process...
Dec 11 01:05:49 np0005554845 neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4[214076]: [ALERT]    (214080) : Current worker (214082) exited with code 143 (Terminated)
Dec 11 01:05:49 np0005554845 neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4[214076]: [WARNING]  (214080) : All workers exited. Exiting... (0)
Dec 11 01:05:49 np0005554845 systemd[1]: libpod-a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5.scope: Deactivated successfully.
Dec 11 01:05:49 np0005554845 podman[214434]: 2025-12-11 06:05:49.02980279 +0000 UTC m=+0.050565813 container died a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 01:05:49 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5-userdata-shm.mount: Deactivated successfully.
Dec 11 01:05:49 np0005554845 systemd[1]: var-lib-containers-storage-overlay-b74cd6c3b033207cdcef29c8878175a5ee8d3376baf441b018a337ae21ae52da-merged.mount: Deactivated successfully.
Dec 11 01:05:49 np0005554845 podman[214434]: 2025-12-11 06:05:49.073000523 +0000 UTC m=+0.093763546 container cleanup a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 01:05:49 np0005554845 systemd[1]: libpod-conmon-a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5.scope: Deactivated successfully.
Dec 11 01:05:49 np0005554845 podman[214465]: 2025-12-11 06:05:49.135096239 +0000 UTC m=+0.042116624 container remove a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 01:05:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:49.141 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[128ddde9-f329-4ef5-9f0a-88327df38c2a]: (4, ('Thu Dec 11 06:05:48 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4 (a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5)\na14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5\nThu Dec 11 06:05:49 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4 (a14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5)\na14c5d66cbe9186f076254b63f5535ae1f93a60998304df7c68fcfda5090b3f5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:49.143 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0e45a270-2ffe-45bc-b517-89c8a5009d36]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:49.144 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaf86bfb7-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.145 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:49 np0005554845 kernel: tapaf86bfb7-20: left promiscuous mode
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.166 187132 DEBUG oslo_concurrency.lockutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Acquiring lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.167 187132 DEBUG oslo_concurrency.lockutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Acquired lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.171 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:49.175 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[641dcdd9-e6bc-4758-870e-a1ba1bbfb88e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.177 187132 INFO nova.compute.rpcapi [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Automatically selected compute RPC version 6.2 from minimum service version 66#033[00m
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.178 187132 DEBUG oslo_concurrency.lockutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Releasing lock "compute-rpcapi-router" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:05:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:49.195 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2d26ec9e-4eef-4431-a3f5-6e83c2f2f1f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:49.196 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1d772aa3-8a50-4a14-8346-381a325251dd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.198 187132 DEBUG nova.compute.provider_tree [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.208 187132 DEBUG oslo_concurrency.lockutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Acquiring lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.208 187132 DEBUG oslo_concurrency.lockutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.208 187132 DEBUG oslo_concurrency.lockutils [None req-97a1faa1-f347-4a90-8071-051bb0f31831 e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:49.213 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[996000d5-8f14-4c8c-91f6-b154bd188bcb]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337830, 'reachable_time': 23452, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214486, 'error': None, 'target': 'ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:49 np0005554845 systemd[1]: run-netns-ovnmeta\x2daf86bfb7\x2d241f\x2d4a6e\x2d8237\x2d9d9593dd5fa4.mount: Deactivated successfully.
Dec 11 01:05:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:49.227 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-af86bfb7-241f-4a6e-8237-9d9593dd5fa4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:05:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:49.228 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[771a4355-38d9-47e6-8a0b-6fc5b410eee8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.234 187132 DEBUG nova.scheduler.client.report [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:05:49 np0005554845 podman[214478]: 2025-12-11 06:05:49.256195576 +0000 UTC m=+0.057951173 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 11 01:05:49 np0005554845 podman[214479]: 2025-12-11 06:05:49.268259674 +0000 UTC m=+0.063209737 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, distribution-scope=public, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, maintainer=Red Hat, Inc., architecture=x86_64, managed_by=edpm_ansible, vendor=Red Hat, Inc., name=ubi9-minimal, config_id=edpm, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, version=9.6, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.buildah.version=1.33.7, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 01:05:49 np0005554845 nova_compute[187128]: 2025-12-11 06:05:49.824 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.486 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 5.286s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.487 187132 DEBUG nova.compute.manager [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.546 187132 DEBUG nova.compute.manager [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.546 187132 DEBUG nova.network.neutron [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.570 187132 INFO nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.594 187132 DEBUG nova.compute.manager [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.688 187132 DEBUG nova.compute.manager [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.691 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.692 187132 INFO nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Creating image(s)#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.694 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "/var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.695 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "/var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.697 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "/var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.719 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.788 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.790 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.791 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.802 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.873 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.070s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.875 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.898 187132 DEBUG nova.policy [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.912 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.913 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.122s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.913 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.971 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.972 187132 DEBUG nova.virt.disk.api [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Checking if we can resize image /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:05:50 np0005554845 nova_compute[187128]: 2025-12-11 06:05:50.973 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.030 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.033 187132 DEBUG nova.virt.disk.api [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Cannot resize image /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.034 187132 DEBUG nova.objects.instance [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lazy-loading 'migration_context' on Instance uuid e73aa485-0628-421b-b10a-b3e54bf3ba4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.059 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.060 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Ensure instance console log exists: /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.060 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.061 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.061 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.271 187132 DEBUG nova.compute.manager [req-ca12bad1-2809-475b-9715-50c6bd47055e req-43fe1a04-3697-453a-83e6-488a2bb4c0bb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received event network-vif-unplugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.272 187132 DEBUG oslo_concurrency.lockutils [req-ca12bad1-2809-475b-9715-50c6bd47055e req-43fe1a04-3697-453a-83e6-488a2bb4c0bb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.273 187132 DEBUG oslo_concurrency.lockutils [req-ca12bad1-2809-475b-9715-50c6bd47055e req-43fe1a04-3697-453a-83e6-488a2bb4c0bb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.274 187132 DEBUG oslo_concurrency.lockutils [req-ca12bad1-2809-475b-9715-50c6bd47055e req-43fe1a04-3697-453a-83e6-488a2bb4c0bb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.274 187132 DEBUG nova.compute.manager [req-ca12bad1-2809-475b-9715-50c6bd47055e req-43fe1a04-3697-453a-83e6-488a2bb4c0bb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] No waiting events found dispatching network-vif-unplugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.275 187132 WARNING nova.compute.manager [req-ca12bad1-2809-475b-9715-50c6bd47055e req-43fe1a04-3697-453a-83e6-488a2bb4c0bb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received unexpected event network-vif-unplugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 for instance with vm_state active and task_state resize_migrated.#033[00m
Dec 11 01:05:51 np0005554845 nova_compute[187128]: 2025-12-11 06:05:51.451 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:52 np0005554845 nova_compute[187128]: 2025-12-11 06:05:52.370 187132 DEBUG nova.network.neutron [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Successfully created port: 591b7425-a3b0-4a58-ac44-61a5cc36e8ea _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:05:53 np0005554845 nova_compute[187128]: 2025-12-11 06:05:53.803 187132 DEBUG nova.network.neutron [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Successfully updated port: 591b7425-a3b0-4a58-ac44-61a5cc36e8ea _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:05:53 np0005554845 nova_compute[187128]: 2025-12-11 06:05:53.818 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "refresh_cache-e73aa485-0628-421b-b10a-b3e54bf3ba4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:05:53 np0005554845 nova_compute[187128]: 2025-12-11 06:05:53.819 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquired lock "refresh_cache-e73aa485-0628-421b-b10a-b3e54bf3ba4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:05:53 np0005554845 nova_compute[187128]: 2025-12-11 06:05:53.819 187132 DEBUG nova.network.neutron [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.097 187132 DEBUG nova.network.neutron [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.826 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.980 187132 DEBUG nova.compute.manager [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received event network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.981 187132 DEBUG oslo_concurrency.lockutils [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.982 187132 DEBUG oslo_concurrency.lockutils [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.983 187132 DEBUG oslo_concurrency.lockutils [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.983 187132 DEBUG nova.compute.manager [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] No waiting events found dispatching network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.984 187132 WARNING nova.compute.manager [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received unexpected event network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 for instance with vm_state active and task_state resize_finish.#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.984 187132 DEBUG nova.compute.manager [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received event network-changed-f7e89a08-ebcf-4928-85e4-e649df5a3196 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.985 187132 DEBUG nova.compute.manager [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Refreshing instance network info cache due to event network-changed-f7e89a08-ebcf-4928-85e4-e649df5a3196. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.985 187132 DEBUG oslo_concurrency.lockutils [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.986 187132 DEBUG oslo_concurrency.lockutils [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:05:54 np0005554845 nova_compute[187128]: 2025-12-11 06:05:54.986 187132 DEBUG nova.network.neutron [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Refreshing network info cache for port f7e89a08-ebcf-4928-85e4-e649df5a3196 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.329 187132 DEBUG nova.network.neutron [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Updating instance_info_cache with network_info: [{"id": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "address": "fa:16:3e:5c:9a:5a", "network": {"id": "b2ae3067-f1da-4d23-957c-204bbcc0cd28", "bridge": "br-int", "label": "tempest-network-smoke--1881398563", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap591b7425-a3", "ovs_interfaceid": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.437 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Releasing lock "refresh_cache-e73aa485-0628-421b-b10a-b3e54bf3ba4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.437 187132 DEBUG nova.compute.manager [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Instance network_info: |[{"id": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "address": "fa:16:3e:5c:9a:5a", "network": {"id": "b2ae3067-f1da-4d23-957c-204bbcc0cd28", "bridge": "br-int", "label": "tempest-network-smoke--1881398563", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap591b7425-a3", "ovs_interfaceid": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.442 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Start _get_guest_xml network_info=[{"id": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "address": "fa:16:3e:5c:9a:5a", "network": {"id": "b2ae3067-f1da-4d23-957c-204bbcc0cd28", "bridge": "br-int", "label": "tempest-network-smoke--1881398563", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap591b7425-a3", "ovs_interfaceid": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.451 187132 WARNING nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.461 187132 DEBUG nova.virt.libvirt.host [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.462 187132 DEBUG nova.virt.libvirt.host [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.465 187132 DEBUG nova.virt.libvirt.host [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.466 187132 DEBUG nova.virt.libvirt.host [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.468 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.468 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.469 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.469 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.469 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.470 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.470 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.470 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.470 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.471 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.471 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.471 187132 DEBUG nova.virt.hardware [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.476 187132 DEBUG nova.virt.libvirt.vif [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:05:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-585701095',display_name='tempest-TestNetworkBasicOps-server-585701095',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-585701095',id=7,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA1Rc2GufyrylrTFfdnJn20kyoyDmcxP5+G4Aneow04i3QMtbJGJKtceg6bpqLLgR5U8DZ+NzKsYj9xJW/fyhHsuboRgC5gbQ9x1d57RrCzcNL2NTqWUcyRTRJlT/CGC/g==',key_name='tempest-TestNetworkBasicOps-343008503',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fce35ab888e44e46b3108813dcdf4163',ramdisk_id='',reservation_id='r-tzltz3bt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1486719489',owner_user_name='tempest-TestNetworkBasicOps-1486719489-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:05:50Z,user_data=None,user_id='3b482a000b3e4b5c964be05bad2a0418',uuid=e73aa485-0628-421b-b10a-b3e54bf3ba4a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "address": "fa:16:3e:5c:9a:5a", "network": {"id": "b2ae3067-f1da-4d23-957c-204bbcc0cd28", "bridge": "br-int", "label": "tempest-network-smoke--1881398563", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap591b7425-a3", "ovs_interfaceid": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.476 187132 DEBUG nova.network.os_vif_util [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converting VIF {"id": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "address": "fa:16:3e:5c:9a:5a", "network": {"id": "b2ae3067-f1da-4d23-957c-204bbcc0cd28", "bridge": "br-int", "label": "tempest-network-smoke--1881398563", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap591b7425-a3", "ovs_interfaceid": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.477 187132 DEBUG nova.network.os_vif_util [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5c:9a:5a,bridge_name='br-int',has_traffic_filtering=True,id=591b7425-a3b0-4a58-ac44-61a5cc36e8ea,network=Network(b2ae3067-f1da-4d23-957c-204bbcc0cd28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap591b7425-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.479 187132 DEBUG nova.objects.instance [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lazy-loading 'pci_devices' on Instance uuid e73aa485-0628-421b-b10a-b3e54bf3ba4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.502 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  <uuid>e73aa485-0628-421b-b10a-b3e54bf3ba4a</uuid>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  <name>instance-00000007</name>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestNetworkBasicOps-server-585701095</nova:name>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:05:55</nova:creationTime>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:        <nova:user uuid="3b482a000b3e4b5c964be05bad2a0418">tempest-TestNetworkBasicOps-1486719489-project-member</nova:user>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:        <nova:project uuid="fce35ab888e44e46b3108813dcdf4163">tempest-TestNetworkBasicOps-1486719489</nova:project>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:        <nova:port uuid="591b7425-a3b0-4a58-ac44-61a5cc36e8ea">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.24" ipVersion="4"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <entry name="serial">e73aa485-0628-421b-b10a-b3e54bf3ba4a</entry>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <entry name="uuid">e73aa485-0628-421b-b10a-b3e54bf3ba4a</entry>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk.config"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:5c:9a:5a"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <target dev="tap591b7425-a3"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/console.log" append="off"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:05:55 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:05:55 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:05:55 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:05:55 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.504 187132 DEBUG nova.compute.manager [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Preparing to wait for external event network-vif-plugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.504 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.504 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.505 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.505 187132 DEBUG nova.virt.libvirt.vif [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:05:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-585701095',display_name='tempest-TestNetworkBasicOps-server-585701095',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-585701095',id=7,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA1Rc2GufyrylrTFfdnJn20kyoyDmcxP5+G4Aneow04i3QMtbJGJKtceg6bpqLLgR5U8DZ+NzKsYj9xJW/fyhHsuboRgC5gbQ9x1d57RrCzcNL2NTqWUcyRTRJlT/CGC/g==',key_name='tempest-TestNetworkBasicOps-343008503',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fce35ab888e44e46b3108813dcdf4163',ramdisk_id='',reservation_id='r-tzltz3bt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1486719489',owner_user_name='tempest-TestNetworkBasicOps-1486719489-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:05:50Z,user_data=None,user_id='3b482a000b3e4b5c964be05bad2a0418',uuid=e73aa485-0628-421b-b10a-b3e54bf3ba4a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "address": "fa:16:3e:5c:9a:5a", "network": {"id": "b2ae3067-f1da-4d23-957c-204bbcc0cd28", "bridge": "br-int", "label": "tempest-network-smoke--1881398563", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap591b7425-a3", "ovs_interfaceid": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.506 187132 DEBUG nova.network.os_vif_util [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converting VIF {"id": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "address": "fa:16:3e:5c:9a:5a", "network": {"id": "b2ae3067-f1da-4d23-957c-204bbcc0cd28", "bridge": "br-int", "label": "tempest-network-smoke--1881398563", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap591b7425-a3", "ovs_interfaceid": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.506 187132 DEBUG nova.network.os_vif_util [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:5c:9a:5a,bridge_name='br-int',has_traffic_filtering=True,id=591b7425-a3b0-4a58-ac44-61a5cc36e8ea,network=Network(b2ae3067-f1da-4d23-957c-204bbcc0cd28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap591b7425-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.507 187132 DEBUG os_vif [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:9a:5a,bridge_name='br-int',has_traffic_filtering=True,id=591b7425-a3b0-4a58-ac44-61a5cc36e8ea,network=Network(b2ae3067-f1da-4d23-957c-204bbcc0cd28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap591b7425-a3') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.507 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.508 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.508 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.511 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.511 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap591b7425-a3, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.512 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap591b7425-a3, col_values=(('external_ids', {'iface-id': '591b7425-a3b0-4a58-ac44-61a5cc36e8ea', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:5c:9a:5a', 'vm-uuid': 'e73aa485-0628-421b-b10a-b3e54bf3ba4a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.514 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.516 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:05:55 np0005554845 NetworkManager[55529]: <info>  [1765433155.5166] manager: (tap591b7425-a3): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/33)
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.526 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.526 187132 INFO os_vif [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:5c:9a:5a,bridge_name='br-int',has_traffic_filtering=True,id=591b7425-a3b0-4a58-ac44-61a5cc36e8ea,network=Network(b2ae3067-f1da-4d23-957c-204bbcc0cd28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap591b7425-a3')#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.588 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.588 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.589 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] No VIF found with MAC fa:16:3e:5c:9a:5a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.590 187132 INFO nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Using config drive#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.986 187132 INFO nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Creating config drive at /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk.config#033[00m
Dec 11 01:05:55 np0005554845 nova_compute[187128]: 2025-12-11 06:05:55.992 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj3k5mzs1 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.127 187132 DEBUG oslo_concurrency.processutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpj3k5mzs1" returned: 0 in 0.135s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:05:56 np0005554845 kernel: tap591b7425-a3: entered promiscuous mode
Dec 11 01:05:56 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:56Z|00044|binding|INFO|Claiming lport 591b7425-a3b0-4a58-ac44-61a5cc36e8ea for this chassis.
Dec 11 01:05:56 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:56Z|00045|binding|INFO|591b7425-a3b0-4a58-ac44-61a5cc36e8ea: Claiming fa:16:3e:5c:9a:5a 10.100.0.24
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.189 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:56 np0005554845 NetworkManager[55529]: <info>  [1765433156.1904] manager: (tap591b7425-a3): new Tun device (/org/freedesktop/NetworkManager/Devices/34)
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.200 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5c:9a:5a 10.100.0.24'], port_security=['fa:16:3e:5c:9a:5a 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'e73aa485-0628-421b-b10a-b3e54bf3ba4a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2ae3067-f1da-4d23-957c-204bbcc0cd28', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fce35ab888e44e46b3108813dcdf4163', 'neutron:revision_number': '2', 'neutron:security_group_ids': '410af5a4-e21c-40e1-a234-1ddcf5e427bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd273b1b-48f9-4047-b963-1665d0628b77, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=591b7425-a3b0-4a58-ac44-61a5cc36e8ea) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.202 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 591b7425-a3b0-4a58-ac44-61a5cc36e8ea in datapath b2ae3067-f1da-4d23-957c-204bbcc0cd28 bound to our chassis#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.204 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network b2ae3067-f1da-4d23-957c-204bbcc0cd28#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.216 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[472e5292-083d-4003-808c-7fc97aee870e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.216 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapb2ae3067-f1 in ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.218 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapb2ae3067-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.218 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7c08967c-7498-4f65-9e74-96a9a8418b6b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.219 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[598b32d1-24f6-4fcb-b638-4918ce8f91ca]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 systemd-udevd[214563]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.223 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:56 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:56Z|00046|binding|INFO|Setting lport 591b7425-a3b0-4a58-ac44-61a5cc36e8ea ovn-installed in OVS
Dec 11 01:05:56 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:56Z|00047|binding|INFO|Setting lport 591b7425-a3b0-4a58-ac44-61a5cc36e8ea up in Southbound
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.231 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.230 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[23f09f85-59ee-4854-9522-ae5f876e73d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 systemd-machined[153381]: New machine qemu-3-instance-00000007.
Dec 11 01:05:56 np0005554845 NetworkManager[55529]: <info>  [1765433156.2387] device (tap591b7425-a3): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:05:56 np0005554845 NetworkManager[55529]: <info>  [1765433156.2393] device (tap591b7425-a3): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:05:56 np0005554845 systemd[1]: Started Virtual Machine qemu-3-instance-00000007.
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.256 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c08d4909-124d-4677-b1cd-edf2adb1a8fe]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.282 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[32814ba5-10e9-4164-a14b-faf9ebc206ab]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 NetworkManager[55529]: <info>  [1765433156.2873] manager: (tapb2ae3067-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/35)
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.286 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[280974d7-0bc6-4df9-a982-d5c7f9f1131b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 systemd-udevd[214568]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.316 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[e89457dc-237f-4fca-b2fc-c17d9ff23ccc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.320 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[090d2f04-2fb7-4120-a221-35b12cde06cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 NetworkManager[55529]: <info>  [1765433156.3422] device (tapb2ae3067-f0): carrier: link connected
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.346 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[cbef3049-afdd-45bd-bcad-65616bfd9171]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.361 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[19928850-fcfc-4af1-add4-674fe9a38e31]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2ae3067-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:6d:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 343098, 'reachable_time': 17349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214596, 'error': None, 'target': 'ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.375 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f1297095-1265-42f1-97cf-6d3517c640ba]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe35:6d6c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 343098, 'tstamp': 343098}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 214597, 'error': None, 'target': 'ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.391 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[727c7b1e-f40e-4b0f-b011-b21812f34040]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapb2ae3067-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:35:6d:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 17], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 343098, 'reachable_time': 17349, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 214598, 'error': None, 'target': 'ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.422 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ff1005fe-6f63-47b8-9fe9-e3815b5b0739]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.476 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a5b069eb-f380-4508-8363-f5003f191e5a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.477 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2ae3067-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.478 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.478 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapb2ae3067-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.511 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:56 np0005554845 NetworkManager[55529]: <info>  [1765433156.5117] manager: (tapb2ae3067-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/36)
Dec 11 01:05:56 np0005554845 kernel: tapb2ae3067-f0: entered promiscuous mode
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.514 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.515 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapb2ae3067-f0, col_values=(('external_ids', {'iface-id': 'bcdface7-6ece-489b-93fc-d3872a13ae49'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:05:56 np0005554845 ovn_controller[95428]: 2025-12-11T06:05:56Z|00048|binding|INFO|Releasing lport bcdface7-6ece-489b-93fc-d3872a13ae49 from this chassis (sb_readonly=0)
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.516 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.518 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/b2ae3067-f1da-4d23-957c-204bbcc0cd28.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/b2ae3067-f1da-4d23-957c-204bbcc0cd28.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.519 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ee34367b-1f4e-4a51-b527-956cef95fbdf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.520 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-b2ae3067-f1da-4d23-957c-204bbcc0cd28
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/b2ae3067-f1da-4d23-957c-204bbcc0cd28.pid.haproxy
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID b2ae3067-f1da-4d23-957c-204bbcc0cd28
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:05:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:05:56.521 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28', 'env', 'PROCESS_TAG=haproxy-b2ae3067-f1da-4d23-957c-204bbcc0cd28', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/b2ae3067-f1da-4d23-957c-204bbcc0cd28.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.528 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.555 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433156.555079, e73aa485-0628-421b-b10a-b3e54bf3ba4a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.556 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] VM Started (Lifecycle Event)#033[00m
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.572 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.575 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433156.5552218, e73aa485-0628-421b-b10a-b3e54bf3ba4a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.575 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.598 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.600 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:05:56 np0005554845 nova_compute[187128]: 2025-12-11 06:05:56.624 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:05:56 np0005554845 podman[214637]: 2025-12-11 06:05:56.893701841 +0000 UTC m=+0.046075851 container create 4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:05:56 np0005554845 systemd[1]: Started libpod-conmon-4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0.scope.
Dec 11 01:05:56 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:05:56 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/52f579af5a84d0a514ef94aec23547726b239253921f00ffdfcca25661db1acc/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:05:56 np0005554845 podman[214637]: 2025-12-11 06:05:56.868559649 +0000 UTC m=+0.020933679 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:05:56 np0005554845 podman[214637]: 2025-12-11 06:05:56.973267851 +0000 UTC m=+0.125641881 container init 4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 01:05:56 np0005554845 podman[214637]: 2025-12-11 06:05:56.982655726 +0000 UTC m=+0.135029736 container start 4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:05:57 np0005554845 nova_compute[187128]: 2025-12-11 06:05:57.018 187132 DEBUG nova.network.neutron [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Updated VIF entry in instance network info cache for port f7e89a08-ebcf-4928-85e4-e649df5a3196. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:05:57 np0005554845 nova_compute[187128]: 2025-12-11 06:05:57.019 187132 DEBUG nova.network.neutron [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Updating instance_info_cache with network_info: [{"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:05:57 np0005554845 neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28[214652]: [NOTICE]   (214656) : New worker (214658) forked
Dec 11 01:05:57 np0005554845 neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28[214652]: [NOTICE]   (214656) : Loading success.
Dec 11 01:05:57 np0005554845 nova_compute[187128]: 2025-12-11 06:05:57.037 187132 DEBUG oslo_concurrency.lockutils [req-fad026ea-765c-4c25-9a79-e9251763f7cb req-8e4997e4-bd66-4ee3-aa3d-7fff9f8a30f8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:05:58 np0005554845 nova_compute[187128]: 2025-12-11 06:05:58.321 187132 DEBUG nova.compute.manager [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Received event network-changed-591b7425-a3b0-4a58-ac44-61a5cc36e8ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:05:58 np0005554845 nova_compute[187128]: 2025-12-11 06:05:58.322 187132 DEBUG nova.compute.manager [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Refreshing instance network info cache due to event network-changed-591b7425-a3b0-4a58-ac44-61a5cc36e8ea. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:05:58 np0005554845 nova_compute[187128]: 2025-12-11 06:05:58.323 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-e73aa485-0628-421b-b10a-b3e54bf3ba4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:05:58 np0005554845 nova_compute[187128]: 2025-12-11 06:05:58.323 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-e73aa485-0628-421b-b10a-b3e54bf3ba4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:05:58 np0005554845 nova_compute[187128]: 2025-12-11 06:05:58.324 187132 DEBUG nova.network.neutron [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Refreshing network info cache for port 591b7425-a3b0-4a58-ac44-61a5cc36e8ea _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:05:59 np0005554845 nova_compute[187128]: 2025-12-11 06:05:59.829 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:00 np0005554845 nova_compute[187128]: 2025-12-11 06:06:00.076 187132 DEBUG oslo_concurrency.lockutils [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "4b205ab7-6622-4644-a404-ec948480d1ba" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:00 np0005554845 nova_compute[187128]: 2025-12-11 06:06:00.077 187132 DEBUG oslo_concurrency.lockutils [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba" acquired by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:00 np0005554845 nova_compute[187128]: 2025-12-11 06:06:00.078 187132 DEBUG nova.compute.manager [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Going to confirm migration 2 do_confirm_resize /usr/lib/python3.9/site-packages/nova/compute/manager.py:4679#033[00m
Dec 11 01:06:00 np0005554845 nova_compute[187128]: 2025-12-11 06:06:00.514 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:00 np0005554845 nova_compute[187128]: 2025-12-11 06:06:00.989 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433145.9893339, 4b205ab7-6622-4644-a404-ec948480d1ba => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:06:00 np0005554845 nova_compute[187128]: 2025-12-11 06:06:00.990 187132 INFO nova.compute.manager [-] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.018 187132 DEBUG neutronclient.v2_0.client [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port f7e89a08-ebcf-4928-85e4-e649df5a3196 for host compute-2.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.018 187132 DEBUG oslo_concurrency.lockutils [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.019 187132 DEBUG oslo_concurrency.lockutils [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquired lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.019 187132 DEBUG nova.network.neutron [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.019 187132 DEBUG nova.objects.instance [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'info_cache' on Instance uuid 4b205ab7-6622-4644-a404-ec948480d1ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.060 187132 DEBUG nova.compute.manager [None req-8b6c2d53-ab67-4c11-a8a7-ed95967acc3a - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.065 187132 DEBUG nova.compute.manager [None req-8b6c2d53-ab67-4c11-a8a7-ed95967acc3a - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: None, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.096 187132 INFO nova.compute.manager [None req-8b6c2d53-ab67-4c11-a8a7-ed95967acc3a - - - - - -] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.194 187132 DEBUG nova.compute.manager [req-70e12df3-20e2-4043-94db-3831b5651ecf req-83428131-2a9a-4847-9e78-d39a72c821e8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Received event network-vif-plugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.194 187132 DEBUG oslo_concurrency.lockutils [req-70e12df3-20e2-4043-94db-3831b5651ecf req-83428131-2a9a-4847-9e78-d39a72c821e8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.195 187132 DEBUG oslo_concurrency.lockutils [req-70e12df3-20e2-4043-94db-3831b5651ecf req-83428131-2a9a-4847-9e78-d39a72c821e8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.195 187132 DEBUG oslo_concurrency.lockutils [req-70e12df3-20e2-4043-94db-3831b5651ecf req-83428131-2a9a-4847-9e78-d39a72c821e8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.196 187132 DEBUG nova.compute.manager [req-70e12df3-20e2-4043-94db-3831b5651ecf req-83428131-2a9a-4847-9e78-d39a72c821e8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Processing event network-vif-plugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.197 187132 DEBUG nova.compute.manager [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.201 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433161.2011094, e73aa485-0628-421b-b10a-b3e54bf3ba4a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.202 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.205 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.209 187132 INFO nova.virt.libvirt.driver [-] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Instance spawned successfully.#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.209 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.229 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.235 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.239 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.239 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.240 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.240 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.241 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.242 187132 DEBUG nova.virt.libvirt.driver [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.282 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.318 187132 INFO nova.compute.manager [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Took 10.63 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.319 187132 DEBUG nova.compute.manager [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.404 187132 INFO nova.compute.manager [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Took 16.25 seconds to build instance.#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.425 187132 DEBUG oslo_concurrency.lockutils [None req-0607f9cb-8d5b-4310-a8bc-62dc514d0a8f 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.356s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.483 187132 DEBUG nova.network.neutron [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Updated VIF entry in instance network info cache for port 591b7425-a3b0-4a58-ac44-61a5cc36e8ea. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.483 187132 DEBUG nova.network.neutron [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Updating instance_info_cache with network_info: [{"id": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "address": "fa:16:3e:5c:9a:5a", "network": {"id": "b2ae3067-f1da-4d23-957c-204bbcc0cd28", "bridge": "br-int", "label": "tempest-network-smoke--1881398563", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap591b7425-a3", "ovs_interfaceid": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.534 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-e73aa485-0628-421b-b10a-b3e54bf3ba4a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.534 187132 DEBUG nova.compute.manager [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received event network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.535 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.535 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.535 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.536 187132 DEBUG nova.compute.manager [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] No waiting events found dispatching network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.536 187132 WARNING nova.compute.manager [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received unexpected event network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 for instance with vm_state resized and task_state None.#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.536 187132 DEBUG nova.compute.manager [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received event network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.537 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.537 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.537 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.537 187132 DEBUG nova.compute.manager [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] No waiting events found dispatching network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.538 187132 WARNING nova.compute.manager [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Received unexpected event network-vif-plugged-f7e89a08-ebcf-4928-85e4-e649df5a3196 for instance with vm_state resized and task_state None.#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.538 187132 DEBUG nova.compute.manager [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Received event network-vif-plugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.538 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.539 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.539 187132 DEBUG oslo_concurrency.lockutils [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.539 187132 DEBUG nova.compute.manager [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] No waiting events found dispatching network-vif-plugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:06:01 np0005554845 nova_compute[187128]: 2025-12-11 06:06:01.539 187132 WARNING nova.compute.manager [req-e9d7b600-f4cd-4297-a288-d07672db0275 req-c15ba3bf-22f2-433c-81e9-a3eab6dd4a70 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Received unexpected event network-vif-plugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea for instance with vm_state building and task_state spawning.#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.763 187132 DEBUG nova.network.neutron [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 4b205ab7-6622-4644-a404-ec948480d1ba] Updating instance_info_cache with network_info: [{"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.797 187132 DEBUG oslo_concurrency.lockutils [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Releasing lock "refresh_cache-4b205ab7-6622-4644-a404-ec948480d1ba" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.797 187132 DEBUG nova.objects.instance [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'migration_context' on Instance uuid 4b205ab7-6622-4644-a404-ec948480d1ba obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.834 187132 DEBUG nova.virt.libvirt.host [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Checking UEFI support for host arch (x86_64) supports_uefi /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1754#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.835 187132 INFO nova.virt.libvirt.host [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] UEFI support detected#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.837 187132 DEBUG nova.virt.libvirt.vif [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:04:45Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1446310862',display_name='tempest-TestNetworkAdvancedServerOps-server-1446310862',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1446310862',id=2,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBNgM/qAsq/W6pl2lUgKsTugKsHWIU0fM/qatNQRqhtCy4/LE7hhfrHzJklvICL0fI2w3nlVpvfyHHrtL8lBcnC0/CqZJ2+IvhPWma1ca7/i1wyykXdHh2hWxolw5MKjldw==',key_name='tempest-TestNetworkAdvancedServerOps-1888270068',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:05:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-0.ctlplane.example.com',numa_topology=<?>,old_flavor=Flavor(1),os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-kodglh7t',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=<?>,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state=None,terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:05:56Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=4b205ab7-6622-4644-a404-ec948480d1ba,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.838 187132 DEBUG nova.network.os_vif_util [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "address": "fa:16:3e:5f:c0:6b", "network": {"id": "af86bfb7-241f-4a6e-8237-9d9593dd5fa4", "bridge": "br-int", "label": "tempest-network-smoke--1124514744", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.196", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf7e89a08-eb", "ovs_interfaceid": "f7e89a08-ebcf-4928-85e4-e649df5a3196", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.839 187132 DEBUG nova.network.os_vif_util [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5f:c0:6b,bridge_name='br-int',has_traffic_filtering=True,id=f7e89a08-ebcf-4928-85e4-e649df5a3196,network=Network(af86bfb7-241f-4a6e-8237-9d9593dd5fa4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e89a08-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.840 187132 DEBUG os_vif [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:c0:6b,bridge_name='br-int',has_traffic_filtering=True,id=f7e89a08-ebcf-4928-85e4-e649df5a3196,network=Network(af86bfb7-241f-4a6e-8237-9d9593dd5fa4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e89a08-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.842 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.842 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf7e89a08-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.842 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.845 187132 INFO os_vif [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5f:c0:6b,bridge_name='br-int',has_traffic_filtering=True,id=f7e89a08-ebcf-4928-85e4-e649df5a3196,network=Network(af86bfb7-241f-4a6e-8237-9d9593dd5fa4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf7e89a08-eb')
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.846 187132 DEBUG oslo_concurrency.lockutils [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:06:02 np0005554845 nova_compute[187128]: 2025-12-11 06:06:02.846 187132 DEBUG oslo_concurrency.lockutils [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:06:03 np0005554845 nova_compute[187128]: 2025-12-11 06:06:03.020 187132 DEBUG nova.compute.provider_tree [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 11 01:06:03 np0005554845 nova_compute[187128]: 2025-12-11 06:06:03.043 187132 DEBUG nova.scheduler.client.report [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 11 01:06:03 np0005554845 nova_compute[187128]: 2025-12-11 06:06:03.093 187132 DEBUG oslo_concurrency.lockutils [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.drop_move_claim_at_source" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:06:03 np0005554845 nova_compute[187128]: 2025-12-11 06:06:03.238 187132 INFO nova.scheduler.client.report [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Deleted allocation for migration 068462b0-a8c3-48ae-9636-ddbab76810ef
Dec 11 01:06:03 np0005554845 nova_compute[187128]: 2025-12-11 06:06:03.310 187132 DEBUG oslo_concurrency.lockutils [None req-f546efe6-653a-4719-b899-e783b9ddd678 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "4b205ab7-6622-4644-a404-ec948480d1ba" "released" by "nova.compute.manager.ComputeManager.confirm_resize.<locals>.do_confirm_resize" :: held 3.233s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:06:04 np0005554845 podman[214668]: 2025-12-11 06:06:04.1515481 +0000 UTC m=+0.069293852 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:06:04 np0005554845 nova_compute[187128]: 2025-12-11 06:06:04.832 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:05 np0005554845 nova_compute[187128]: 2025-12-11 06:06:05.516 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:09 np0005554845 nova_compute[187128]: 2025-12-11 06:06:09.833 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:10 np0005554845 podman[214694]: 2025-12-11 06:06:10.179737639 +0000 UTC m=+0.103234594 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ceilometer_agent_compute, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:06:10 np0005554845 nova_compute[187128]: 2025-12-11 06:06:10.519 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:12 np0005554845 podman[214714]: 2025-12-11 06:06:12.14031274 +0000 UTC m=+0.068986324 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 11 01:06:12 np0005554845 podman[214715]: 2025-12-11 06:06:12.18047472 +0000 UTC m=+0.103746037 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 01:06:13 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:13Z|00008|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:5c:9a:5a 10.100.0.24
Dec 11 01:06:13 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:13Z|00009|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:5c:9a:5a 10.100.0.24
Dec 11 01:06:14 np0005554845 nova_compute[187128]: 2025-12-11 06:06:14.836 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:15 np0005554845 podman[214781]: 2025-12-11 06:06:15.160511725 +0000 UTC m=+0.089011107 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true)
Dec 11 01:06:15 np0005554845 nova_compute[187128]: 2025-12-11 06:06:15.522 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:18 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:18.160 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=5, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=4) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 11 01:06:18 np0005554845 nova_compute[187128]: 2025-12-11 06:06:18.161 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:18 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:18.163 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 11 01:06:18 np0005554845 nova_compute[187128]: 2025-12-11 06:06:18.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:06:18 np0005554845 nova_compute[187128]: 2025-12-11 06:06:18.736 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:06:18 np0005554845 nova_compute[187128]: 2025-12-11 06:06:18.737 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:06:18 np0005554845 nova_compute[187128]: 2025-12-11 06:06:18.738 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:06:18 np0005554845 nova_compute[187128]: 2025-12-11 06:06:18.738 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.087 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.145 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.146 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.201 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.210 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.267 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.268 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.322 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.533 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.534 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5404MB free_disk=73.27304077148438GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.535 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.535 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:06:19 np0005554845 nova_compute[187128]: 2025-12-11 06:06:19.838 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:20 np0005554845 nova_compute[187128]: 2025-12-11 06:06:20.149 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance f2c66e64-57a7-4e97-8552-80a9d24397f6 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 11 01:06:20 np0005554845 nova_compute[187128]: 2025-12-11 06:06:20.150 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance e73aa485-0628-421b-b10a-b3e54bf3ba4a actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 11 01:06:20 np0005554845 nova_compute[187128]: 2025-12-11 06:06:20.150 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 11 01:06:20 np0005554845 nova_compute[187128]: 2025-12-11 06:06:20.152 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 11 01:06:20 np0005554845 podman[214815]: 2025-12-11 06:06:20.152357124 +0000 UTC m=+0.065013236 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:06:20 np0005554845 podman[214816]: 2025-12-11 06:06:20.183567031 +0000 UTC m=+0.102016150 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, io.buildah.version=1.33.7, version=9.6, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vcs-type=git, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, build-date=2025-08-20T13:12:41, distribution-scope=public, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, io.openshift.expose-services=, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 01:06:20 np0005554845 nova_compute[187128]: 2025-12-11 06:06:20.222 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 11 01:06:20 np0005554845 nova_compute[187128]: 2025-12-11 06:06:20.241 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 11 01:06:20 np0005554845 nova_compute[187128]: 2025-12-11 06:06:20.312 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 11 01:06:20 np0005554845 nova_compute[187128]: 2025-12-11 06:06:20.313 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.778s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:06:20 np0005554845 nova_compute[187128]: 2025-12-11 06:06:20.525 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:21 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:21.166 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '5'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.313 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.313 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.314 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.373 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.373 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.444 187132 DEBUG oslo_concurrency.lockutils [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.444 187132 DEBUG oslo_concurrency.lockutils [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.445 187132 DEBUG oslo_concurrency.lockutils [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.445 187132 DEBUG oslo_concurrency.lockutils [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.445 187132 DEBUG oslo_concurrency.lockutils [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.447 187132 INFO nova.compute.manager [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Terminating instance#033[00m
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.449 187132 DEBUG nova.compute.manager [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:06:23 np0005554845 kernel: tap591b7425-a3 (unregistering): left promiscuous mode
Dec 11 01:06:23 np0005554845 NetworkManager[55529]: <info>  [1765433183.4701] device (tap591b7425-a3): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:06:23 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:23Z|00049|binding|INFO|Releasing lport 591b7425-a3b0-4a58-ac44-61a5cc36e8ea from this chassis (sb_readonly=0)
Dec 11 01:06:23 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:23Z|00050|binding|INFO|Setting lport 591b7425-a3b0-4a58-ac44-61a5cc36e8ea down in Southbound
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.477 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:23 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:23Z|00051|binding|INFO|Removing iface tap591b7425-a3 ovn-installed in OVS
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.479 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.488 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:5c:9a:5a 10.100.0.24'], port_security=['fa:16:3e:5c:9a:5a 10.100.0.24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.24/28', 'neutron:device_id': 'e73aa485-0628-421b-b10a-b3e54bf3ba4a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-b2ae3067-f1da-4d23-957c-204bbcc0cd28', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fce35ab888e44e46b3108813dcdf4163', 'neutron:revision_number': '4', 'neutron:security_group_ids': '410af5a4-e21c-40e1-a234-1ddcf5e427bb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=bd273b1b-48f9-4047-b963-1665d0628b77, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=591b7425-a3b0-4a58-ac44-61a5cc36e8ea) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.490 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.491 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 591b7425-a3b0-4a58-ac44-61a5cc36e8ea in datapath b2ae3067-f1da-4d23-957c-204bbcc0cd28 unbound from our chassis
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.493 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network b2ae3067-f1da-4d23-957c-204bbcc0cd28, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.493 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1c6134fa-dd40-49d6-841c-a97b558e0f3b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.494 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28 namespace which is not needed anymore
Dec 11 01:06:23 np0005554845 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Deactivated successfully.
Dec 11 01:06:23 np0005554845 systemd[1]: machine-qemu\x2d3\x2dinstance\x2d00000007.scope: Consumed 12.887s CPU time.
Dec 11 01:06:23 np0005554845 systemd-machined[153381]: Machine qemu-3-instance-00000007 terminated.
Dec 11 01:06:23 np0005554845 neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28[214652]: [NOTICE]   (214656) : haproxy version is 2.8.14-c23fe91
Dec 11 01:06:23 np0005554845 neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28[214652]: [NOTICE]   (214656) : path to executable is /usr/sbin/haproxy
Dec 11 01:06:23 np0005554845 neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28[214652]: [WARNING]  (214656) : Exiting Master process...
Dec 11 01:06:23 np0005554845 neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28[214652]: [ALERT]    (214656) : Current worker (214658) exited with code 143 (Terminated)
Dec 11 01:06:23 np0005554845 neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28[214652]: [WARNING]  (214656) : All workers exited. Exiting... (0)
Dec 11 01:06:23 np0005554845 systemd[1]: libpod-4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0.scope: Deactivated successfully.
Dec 11 01:06:23 np0005554845 podman[214883]: 2025-12-11 06:06:23.632789522 +0000 UTC m=+0.044226971 container died 4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:06:23 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0-userdata-shm.mount: Deactivated successfully.
Dec 11 01:06:23 np0005554845 systemd[1]: var-lib-containers-storage-overlay-52f579af5a84d0a514ef94aec23547726b239253921f00ffdfcca25661db1acc-merged.mount: Deactivated successfully.
Dec 11 01:06:23 np0005554845 podman[214883]: 2025-12-11 06:06:23.690432678 +0000 UTC m=+0.101870117 container cleanup 4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:06:23 np0005554845 systemd[1]: libpod-conmon-4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0.scope: Deactivated successfully.
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.710 187132 INFO nova.virt.libvirt.driver [-] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Instance destroyed successfully.
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.710 187132 DEBUG nova.objects.instance [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lazy-loading 'resources' on Instance uuid e73aa485-0628-421b-b10a-b3e54bf3ba4a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.728 187132 DEBUG nova.virt.libvirt.vif [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:05:42Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-585701095',display_name='tempest-TestNetworkBasicOps-server-585701095',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-585701095',id=7,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBA1Rc2GufyrylrTFfdnJn20kyoyDmcxP5+G4Aneow04i3QMtbJGJKtceg6bpqLLgR5U8DZ+NzKsYj9xJW/fyhHsuboRgC5gbQ9x1d57RrCzcNL2NTqWUcyRTRJlT/CGC/g==',key_name='tempest-TestNetworkBasicOps-343008503',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:06:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fce35ab888e44e46b3108813dcdf4163',ramdisk_id='',reservation_id='r-tzltz3bt',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1486719489',owner_user_name='tempest-TestNetworkBasicOps-1486719489-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:06:01Z,user_data=None,user_id='3b482a000b3e4b5c964be05bad2a0418',uuid=e73aa485-0628-421b-b10a-b3e54bf3ba4a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "address": "fa:16:3e:5c:9a:5a", "network": {"id": "b2ae3067-f1da-4d23-957c-204bbcc0cd28", "bridge": "br-int", "label": "tempest-network-smoke--1881398563", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap591b7425-a3", "ovs_interfaceid": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.729 187132 DEBUG nova.network.os_vif_util [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converting VIF {"id": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "address": "fa:16:3e:5c:9a:5a", "network": {"id": "b2ae3067-f1da-4d23-957c-204bbcc0cd28", "bridge": "br-int", "label": "tempest-network-smoke--1881398563", "subnets": [{"cidr": "10.100.0.16/28", "dns": [], "gateway": {"address": "10.100.0.17", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.24", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap591b7425-a3", "ovs_interfaceid": "591b7425-a3b0-4a58-ac44-61a5cc36e8ea", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.729 187132 DEBUG nova.network.os_vif_util [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:5c:9a:5a,bridge_name='br-int',has_traffic_filtering=True,id=591b7425-a3b0-4a58-ac44-61a5cc36e8ea,network=Network(b2ae3067-f1da-4d23-957c-204bbcc0cd28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap591b7425-a3') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.730 187132 DEBUG os_vif [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:5c:9a:5a,bridge_name='br-int',has_traffic_filtering=True,id=591b7425-a3b0-4a58-ac44-61a5cc36e8ea,network=Network(b2ae3067-f1da-4d23-957c-204bbcc0cd28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap591b7425-a3') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.731 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.731 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap591b7425-a3, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.733 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.734 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.736 187132 INFO os_vif [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:5c:9a:5a,bridge_name='br-int',has_traffic_filtering=True,id=591b7425-a3b0-4a58-ac44-61a5cc36e8ea,network=Network(b2ae3067-f1da-4d23-957c-204bbcc0cd28),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap591b7425-a3')
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.737 187132 INFO nova.virt.libvirt.driver [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Deleting instance files /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a_del
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.738 187132 INFO nova.virt.libvirt.driver [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Deletion of /var/lib/nova/instances/e73aa485-0628-421b-b10a-b3e54bf3ba4a_del complete
Dec 11 01:06:23 np0005554845 podman[214926]: 2025-12-11 06:06:23.75241698 +0000 UTC m=+0.040441189 container remove 4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.757 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[32df0d28-5a33-4e9a-bb60-aba507f64d6f]: (4, ('Thu Dec 11 06:06:23 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28 (4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0)\n4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0\nThu Dec 11 06:06:23 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28 (4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0)\n4eeb3a12a9d6c65f65a4ed324aac45f42f86c0e37d350863b8fb05f341ec66d0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.759 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[af956035-b91e-4efe-8d8d-361587d347bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.760 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapb2ae3067-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.762 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:23 np0005554845 kernel: tapb2ae3067-f0: left promiscuous mode
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.774 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.776 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[18999279-040f-4c57-89f0-49095de17c71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.791 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1e987e30-8112-4cde-ada3-37281d234427]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.792 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[52b0330d-119c-4377-a523-952562c1d388]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.795 187132 INFO nova.compute.manager [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Took 0.35 seconds to destroy the instance on the hypervisor.
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.796 187132 DEBUG oslo.service.loopingcall [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.796 187132 DEBUG nova.compute.manager [-] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 11 01:06:23 np0005554845 nova_compute[187128]: 2025-12-11 06:06:23.796 187132 DEBUG nova.network.neutron [-] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.806 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[9256a9ab-7b9a-4bdb-8981-d33079de182a]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 343092, 'reachable_time': 30139, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 214943, 'error': None, 'target': 'ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.808 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-b2ae3067-f1da-4d23-957c-204bbcc0cd28 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 11 01:06:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:23.808 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[11c4ec6e-0cd5-4d96-8a64-62ca8aacda8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:06:23 np0005554845 systemd[1]: run-netns-ovnmeta\x2db2ae3067\x2df1da\x2d4d23\x2d957c\x2d204bbcc0cd28.mount: Deactivated successfully.
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.363 187132 DEBUG nova.compute.manager [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Received event network-vif-unplugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.363 187132 DEBUG oslo_concurrency.lockutils [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.363 187132 DEBUG oslo_concurrency.lockutils [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.363 187132 DEBUG oslo_concurrency.lockutils [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.364 187132 DEBUG nova.compute.manager [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] No waiting events found dispatching network-vif-unplugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.364 187132 DEBUG nova.compute.manager [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Received event network-vif-unplugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.364 187132 DEBUG nova.compute.manager [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Received event network-vif-plugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.364 187132 DEBUG oslo_concurrency.lockutils [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.364 187132 DEBUG oslo_concurrency.lockutils [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.364 187132 DEBUG oslo_concurrency.lockutils [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.365 187132 DEBUG nova.compute.manager [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] No waiting events found dispatching network-vif-plugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.365 187132 WARNING nova.compute.manager [req-0905e752-da88-45d5-b47f-8a7c5f35f0b0 req-2b71062a-4506-4965-9916-17816d3be8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Received unexpected event network-vif-plugged-591b7425-a3b0-4a58-ac44-61a5cc36e8ea for instance with vm_state active and task_state deleting.#033[00m
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:06:24 np0005554845 nova_compute[187128]: 2025-12-11 06:06:24.840 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.143 187132 DEBUG nova.network.neutron [-] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.174 187132 INFO nova.compute.manager [-] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Took 1.38 seconds to deallocate network for instance.#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.218 187132 DEBUG oslo_concurrency.lockutils [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.219 187132 DEBUG oslo_concurrency.lockutils [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.310 187132 DEBUG nova.compute.provider_tree [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.333 187132 DEBUG nova.scheduler.client.report [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.358 187132 DEBUG oslo_concurrency.lockutils [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.397 187132 INFO nova.scheduler.client.report [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Deleted allocations for instance e73aa485-0628-421b-b10a-b3e54bf3ba4a#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.481 187132 DEBUG oslo_concurrency.lockutils [None req-0ca8f12a-8459-43c9-8343-139688156ab7 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "e73aa485-0628-421b-b10a-b3e54bf3ba4a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.036s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:06:25 np0005554845 nova_compute[187128]: 2025-12-11 06:06:25.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:06:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:26.219 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:26.220 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:26.221 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:26 np0005554845 nova_compute[187128]: 2025-12-11 06:06:26.456 187132 DEBUG nova.compute.manager [req-c11ec356-3cf5-4445-bf69-f978176b5371 req-8a8fb679-45ad-437d-bdb8-68e063c7a280 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Received event network-vif-deleted-591b7425-a3b0-4a58-ac44-61a5cc36e8ea external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:26 np0005554845 nova_compute[187128]: 2025-12-11 06:06:26.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:06:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:27Z|00052|binding|INFO|Releasing lport c8ffeef2-7a6e-414a-8ca6-6cf7e6bf2700 from this chassis (sb_readonly=0)
Dec 11 01:06:27 np0005554845 nova_compute[187128]: 2025-12-11 06:06:27.193 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.733 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.742 187132 DEBUG oslo_concurrency.lockutils [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "f2c66e64-57a7-4e97-8552-80a9d24397f6" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.742 187132 DEBUG oslo_concurrency.lockutils [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.743 187132 DEBUG oslo_concurrency.lockutils [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.743 187132 DEBUG oslo_concurrency.lockutils [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.743 187132 DEBUG oslo_concurrency.lockutils [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.745 187132 INFO nova.compute.manager [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Terminating instance#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.746 187132 DEBUG nova.compute.manager [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:06:28 np0005554845 kernel: tap6b225150-80 (unregistering): left promiscuous mode
Dec 11 01:06:28 np0005554845 NetworkManager[55529]: <info>  [1765433188.7724] device (tap6b225150-80): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:06:28 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:28Z|00053|binding|INFO|Releasing lport 6b225150-8014-4488-91e5-7faf65ace151 from this chassis (sb_readonly=0)
Dec 11 01:06:28 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:28Z|00054|binding|INFO|Setting lport 6b225150-8014-4488-91e5-7faf65ace151 down in Southbound
Dec 11 01:06:28 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:28Z|00055|binding|INFO|Removing iface tap6b225150-80 ovn-installed in OVS
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.781 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.783 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:28.791 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8c:57:fd 10.100.0.5'], port_security=['fa:16:3e:8c:57:fd 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': 'f2c66e64-57a7-4e97-8552-80a9d24397f6', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-869c578a-42b0-4a82-a564-a3681a196ad7', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fce35ab888e44e46b3108813dcdf4163', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dbc12d2a-513a-45e0-9da7-c3b6cdd3e2e7', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=72246518-7492-4032-b5a8-4189af5b12a8, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=6b225150-8014-4488-91e5-7faf65ace151) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:06:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:28.793 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 6b225150-8014-4488-91e5-7faf65ace151 in datapath 869c578a-42b0-4a82-a564-a3681a196ad7 unbound from our chassis#033[00m
Dec 11 01:06:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:28.796 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 869c578a-42b0-4a82-a564-a3681a196ad7, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:06:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:28.796 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7b2aa98a-7ce1-4f6a-9248-7c8d1425c366]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:28 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:28.797 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7 namespace which is not needed anymore#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.799 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:28 np0005554845 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Deactivated successfully.
Dec 11 01:06:28 np0005554845 systemd[1]: machine-qemu\x2d1\x2dinstance\x2d00000001.scope: Consumed 15.834s CPU time.
Dec 11 01:06:28 np0005554845 systemd-machined[153381]: Machine qemu-1-instance-00000001 terminated.
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.919 187132 DEBUG nova.compute.manager [req-aa9449e1-09b1-4227-90e9-2ca3a8c13e14 req-f5eb9a20-a924-47a4-9de8-ea4a38cd5a65 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Received event network-changed-6b225150-8014-4488-91e5-7faf65ace151 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.919 187132 DEBUG nova.compute.manager [req-aa9449e1-09b1-4227-90e9-2ca3a8c13e14 req-f5eb9a20-a924-47a4-9de8-ea4a38cd5a65 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Refreshing instance network info cache due to event network-changed-6b225150-8014-4488-91e5-7faf65ace151. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.920 187132 DEBUG oslo_concurrency.lockutils [req-aa9449e1-09b1-4227-90e9-2ca3a8c13e14 req-f5eb9a20-a924-47a4-9de8-ea4a38cd5a65 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.920 187132 DEBUG oslo_concurrency.lockutils [req-aa9449e1-09b1-4227-90e9-2ca3a8c13e14 req-f5eb9a20-a924-47a4-9de8-ea4a38cd5a65 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.920 187132 DEBUG nova.network.neutron [req-aa9449e1-09b1-4227-90e9-2ca3a8c13e14 req-f5eb9a20-a924-47a4-9de8-ea4a38cd5a65 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Refreshing network info cache for port 6b225150-8014-4488-91e5-7faf65ace151 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:06:28 np0005554845 neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7[214002]: [NOTICE]   (214006) : haproxy version is 2.8.14-c23fe91
Dec 11 01:06:28 np0005554845 neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7[214002]: [NOTICE]   (214006) : path to executable is /usr/sbin/haproxy
Dec 11 01:06:28 np0005554845 neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7[214002]: [WARNING]  (214006) : Exiting Master process...
Dec 11 01:06:28 np0005554845 neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7[214002]: [WARNING]  (214006) : Exiting Master process...
Dec 11 01:06:28 np0005554845 neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7[214002]: [ALERT]    (214006) : Current worker (214008) exited with code 143 (Terminated)
Dec 11 01:06:28 np0005554845 neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7[214002]: [WARNING]  (214006) : All workers exited. Exiting... (0)
Dec 11 01:06:28 np0005554845 systemd[1]: libpod-1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165.scope: Deactivated successfully.
Dec 11 01:06:28 np0005554845 podman[214968]: 2025-12-11 06:06:28.949080446 +0000 UTC m=+0.043661686 container died 1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.971 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:28 np0005554845 nova_compute[187128]: 2025-12-11 06:06:28.977 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:28 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165-userdata-shm.mount: Deactivated successfully.
Dec 11 01:06:29 np0005554845 systemd[1]: var-lib-containers-storage-overlay-0c1f5029eaa9687fb645cb97918d9fa85c2f210e49fb62adeb9a75d621d9890a-merged.mount: Deactivated successfully.
Dec 11 01:06:29 np0005554845 podman[214968]: 2025-12-11 06:06:29.009946248 +0000 UTC m=+0.104527488 container cleanup 1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.016 187132 INFO nova.virt.libvirt.driver [-] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Instance destroyed successfully.#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.017 187132 DEBUG nova.objects.instance [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lazy-loading 'resources' on Instance uuid f2c66e64-57a7-4e97-8552-80a9d24397f6 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:06:29 np0005554845 systemd[1]: libpod-conmon-1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165.scope: Deactivated successfully.
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.033 187132 DEBUG nova.virt.libvirt.vif [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:04:44Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1021466613',display_name='tempest-TestNetworkBasicOps-server-1021466613',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1021466613',id=1,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBH+j5FDWj2GbHAPAE98Uw4tQgPQNj9jziabv99iPtZbTEcOFL2RudLP/QAtVoXbHMhkSVxf71retgLVhjIxHZe68LaLI6P9zas5/bYFBwjZF2VjQRJddZnimyVDztq19nQ==',key_name='tempest-TestNetworkBasicOps-1647276356',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:05:11Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fce35ab888e44e46b3108813dcdf4163',ramdisk_id='',reservation_id='r-nfr2nn4p',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1486719489',owner_user_name='tempest-TestNetworkBasicOps-1486719489-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:05:11Z,user_data=None,user_id='3b482a000b3e4b5c964be05bad2a0418',uuid=f2c66e64-57a7-4e97-8552-80a9d24397f6,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.034 187132 DEBUG nova.network.os_vif_util [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converting VIF {"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.209", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.034 187132 DEBUG nova.network.os_vif_util [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8c:57:fd,bridge_name='br-int',has_traffic_filtering=True,id=6b225150-8014-4488-91e5-7faf65ace151,network=Network(869c578a-42b0-4a82-a564-a3681a196ad7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b225150-80') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.035 187132 DEBUG os_vif [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:57:fd,bridge_name='br-int',has_traffic_filtering=True,id=6b225150-8014-4488-91e5-7faf65ace151,network=Network(869c578a-42b0-4a82-a564-a3681a196ad7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b225150-80') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.036 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.037 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap6b225150-80, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.041 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.043 187132 INFO os_vif [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8c:57:fd,bridge_name='br-int',has_traffic_filtering=True,id=6b225150-8014-4488-91e5-7faf65ace151,network=Network(869c578a-42b0-4a82-a564-a3681a196ad7),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap6b225150-80')#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.044 187132 INFO nova.virt.libvirt.driver [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Deleting instance files /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6_del#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.044 187132 INFO nova.virt.libvirt.driver [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Deletion of /var/lib/nova/instances/f2c66e64-57a7-4e97-8552-80a9d24397f6_del complete#033[00m
Dec 11 01:06:29 np0005554845 podman[215009]: 2025-12-11 06:06:29.083165016 +0000 UTC m=+0.046709089 container remove 1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:06:29 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:29.089 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0547382d-76e9-491b-b2f7-8f1516da4541]: (4, ('Thu Dec 11 06:06:28 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7 (1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165)\n1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165\nThu Dec 11 06:06:29 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7 (1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165)\n1c1665d3bea4a0445a092689e02b3ad0a84d06758c40eded92ce46de18305165\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:29 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:29.091 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[04a52b3d-3670-4e8e-b3ef-1e850aae169a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:29 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:29.091 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap869c578a-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.093 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:29 np0005554845 kernel: tap869c578a-40: left promiscuous mode
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.117 187132 INFO nova.compute.manager [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Took 0.37 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.118 187132 DEBUG oslo.service.loopingcall [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.118 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.119 187132 DEBUG nova.compute.manager [-] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.120 187132 DEBUG nova.network.neutron [-] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:06:29 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:29.122 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2c8cd7-3861-4220-9bb8-2313dd0d3e1a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:29 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:29.135 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8201d516-9fdf-42d2-a3b7-f96c7054fb79]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:29 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:29.137 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2c2047ea-806c-43f0-b332-84a92b6ccbfc]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:29 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:29.155 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[983d365e-04b2-4bb4-931b-abbd66300b41]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 337661, 'reachable_time': 44418, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215024, 'error': None, 'target': 'ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:29 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:29.157 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-869c578a-42b0-4a82-a564-a3681a196ad7 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:06:29 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:29.157 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[50d7604d-d688-4799-a1a1-ad756d6ea034]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:29 np0005554845 systemd[1]: run-netns-ovnmeta\x2d869c578a\x2d42b0\x2d4a82\x2da564\x2da3681a196ad7.mount: Deactivated successfully.
Dec 11 01:06:29 np0005554845 nova_compute[187128]: 2025-12-11 06:06:29.842 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:30 np0005554845 nova_compute[187128]: 2025-12-11 06:06:30.820 187132 DEBUG nova.network.neutron [-] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:06:30 np0005554845 nova_compute[187128]: 2025-12-11 06:06:30.854 187132 INFO nova.compute.manager [-] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Took 1.73 seconds to deallocate network for instance.#033[00m
Dec 11 01:06:30 np0005554845 nova_compute[187128]: 2025-12-11 06:06:30.947 187132 DEBUG oslo_concurrency.lockutils [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:30 np0005554845 nova_compute[187128]: 2025-12-11 06:06:30.948 187132 DEBUG oslo_concurrency.lockutils [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.013 187132 DEBUG nova.compute.provider_tree [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.029 187132 DEBUG nova.scheduler.client.report [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.056 187132 DEBUG oslo_concurrency.lockutils [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.093 187132 INFO nova.scheduler.client.report [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Deleted allocations for instance f2c66e64-57a7-4e97-8552-80a9d24397f6#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.150 187132 DEBUG nova.compute.manager [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Received event network-vif-unplugged-6b225150-8014-4488-91e5-7faf65ace151 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.151 187132 DEBUG oslo_concurrency.lockutils [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.151 187132 DEBUG oslo_concurrency.lockutils [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.151 187132 DEBUG oslo_concurrency.lockutils [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.152 187132 DEBUG nova.compute.manager [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] No waiting events found dispatching network-vif-unplugged-6b225150-8014-4488-91e5-7faf65ace151 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.152 187132 WARNING nova.compute.manager [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Received unexpected event network-vif-unplugged-6b225150-8014-4488-91e5-7faf65ace151 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.152 187132 DEBUG nova.compute.manager [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Received event network-vif-plugged-6b225150-8014-4488-91e5-7faf65ace151 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.152 187132 DEBUG oslo_concurrency.lockutils [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.152 187132 DEBUG oslo_concurrency.lockutils [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.153 187132 DEBUG oslo_concurrency.lockutils [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.153 187132 DEBUG nova.compute.manager [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] No waiting events found dispatching network-vif-plugged-6b225150-8014-4488-91e5-7faf65ace151 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.153 187132 WARNING nova.compute.manager [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Received unexpected event network-vif-plugged-6b225150-8014-4488-91e5-7faf65ace151 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.153 187132 DEBUG nova.compute.manager [req-e911afe1-e93d-49c8-8b42-46eac54dd949 req-45edd899-28a5-438d-87bb-d09c76361026 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Received event network-vif-deleted-6b225150-8014-4488-91e5-7faf65ace151 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.164 187132 DEBUG oslo_concurrency.lockutils [None req-d555ae3c-a38e-4fad-95a1-f3c0e4abafa8 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "f2c66e64-57a7-4e97-8552-80a9d24397f6" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.421s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.865 187132 DEBUG nova.network.neutron [req-aa9449e1-09b1-4227-90e9-2ca3a8c13e14 req-f5eb9a20-a924-47a4-9de8-ea4a38cd5a65 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Updated VIF entry in instance network info cache for port 6b225150-8014-4488-91e5-7faf65ace151. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.866 187132 DEBUG nova.network.neutron [req-aa9449e1-09b1-4227-90e9-2ca3a8c13e14 req-f5eb9a20-a924-47a4-9de8-ea4a38cd5a65 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Updating instance_info_cache with network_info: [{"id": "6b225150-8014-4488-91e5-7faf65ace151", "address": "fa:16:3e:8c:57:fd", "network": {"id": "869c578a-42b0-4a82-a564-a3681a196ad7", "bridge": "br-int", "label": "tempest-network-smoke--479030848", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap6b225150-80", "ovs_interfaceid": "6b225150-8014-4488-91e5-7faf65ace151", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:06:31 np0005554845 nova_compute[187128]: 2025-12-11 06:06:31.906 187132 DEBUG oslo_concurrency.lockutils [req-aa9449e1-09b1-4227-90e9-2ca3a8c13e14 req-f5eb9a20-a924-47a4-9de8-ea4a38cd5a65 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-f2c66e64-57a7-4e97-8552-80a9d24397f6" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:06:33 np0005554845 nova_compute[187128]: 2025-12-11 06:06:33.433 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:33 np0005554845 nova_compute[187128]: 2025-12-11 06:06:33.789 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:34 np0005554845 nova_compute[187128]: 2025-12-11 06:06:34.039 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:34 np0005554845 nova_compute[187128]: 2025-12-11 06:06:34.844 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:35 np0005554845 podman[215026]: 2025-12-11 06:06:35.159501162 +0000 UTC m=+0.077618919 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:06:38 np0005554845 nova_compute[187128]: 2025-12-11 06:06:38.709 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433183.708269, e73aa485-0628-421b-b10a-b3e54bf3ba4a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:06:38 np0005554845 nova_compute[187128]: 2025-12-11 06:06:38.710 187132 INFO nova.compute.manager [-] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:06:38 np0005554845 nova_compute[187128]: 2025-12-11 06:06:38.740 187132 DEBUG nova.compute.manager [None req-cc66b594-b476-4810-84a0-4d760b45fe56 - - - - - -] [instance: e73aa485-0628-421b-b10a-b3e54bf3ba4a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:06:39 np0005554845 nova_compute[187128]: 2025-12-11 06:06:39.041 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:39 np0005554845 nova_compute[187128]: 2025-12-11 06:06:39.846 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:41 np0005554845 podman[215050]: 2025-12-11 06:06:41.172127208 +0000 UTC m=+0.095947635 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_id=edpm)
Dec 11 01:06:43 np0005554845 podman[215070]: 2025-12-11 06:06:43.154115881 +0000 UTC m=+0.073196039 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 11 01:06:43 np0005554845 podman[215071]: 2025-12-11 06:06:43.21452176 +0000 UTC m=+0.120789989 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Dec 11 01:06:44 np0005554845 nova_compute[187128]: 2025-12-11 06:06:44.014 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433189.0133817, f2c66e64-57a7-4e97-8552-80a9d24397f6 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:06:44 np0005554845 nova_compute[187128]: 2025-12-11 06:06:44.014 187132 INFO nova.compute.manager [-] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:06:44 np0005554845 nova_compute[187128]: 2025-12-11 06:06:44.040 187132 DEBUG nova.compute.manager [None req-77171a4a-cb6a-4e48-8e56-1d959f83325c - - - - - -] [instance: f2c66e64-57a7-4e97-8552-80a9d24397f6] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:06:44 np0005554845 nova_compute[187128]: 2025-12-11 06:06:44.044 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:44 np0005554845 nova_compute[187128]: 2025-12-11 06:06:44.847 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:46 np0005554845 podman[215117]: 2025-12-11 06:06:46.134195237 +0000 UTC m=+0.062381724 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd)
Dec 11 01:06:47 np0005554845 nova_compute[187128]: 2025-12-11 06:06:47.989 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:47 np0005554845 nova_compute[187128]: 2025-12-11 06:06:47.990 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.017 187132 DEBUG nova.compute.manager [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.165 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.166 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.174 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.174 187132 INFO nova.compute.claims [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.347 187132 DEBUG nova.compute.provider_tree [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.368 187132 DEBUG nova.scheduler.client.report [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.392 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.226s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.393 187132 DEBUG nova.compute.manager [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.728 187132 DEBUG nova.compute.manager [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.729 187132 DEBUG nova.network.neutron [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.755 187132 INFO nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.781 187132 DEBUG nova.compute.manager [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.894 187132 DEBUG nova.compute.manager [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.895 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.896 187132 INFO nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Creating image(s)#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.897 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.897 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.898 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.912 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.989 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.077s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.990 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:48 np0005554845 nova_compute[187128]: 2025-12-11 06:06:48.991 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.001 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.046 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.061 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.062 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.249 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk 1073741824" returned: 0 in 0.187s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.251 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.260s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.251 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.318 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.321 187132 DEBUG nova.virt.disk.api [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Checking if we can resize image /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.322 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.402 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json" returned: 0 in 0.081s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.404 187132 DEBUG nova.virt.disk.api [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Cannot resize image /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.404 187132 DEBUG nova.objects.instance [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'migration_context' on Instance uuid d29187d8-59e6-4e5a-aef7-97fef6cf24c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.431 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.432 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Ensure instance console log exists: /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.433 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.434 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.434 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.453 187132 DEBUG nova.policy [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:06:49 np0005554845 nova_compute[187128]: 2025-12-11 06:06:49.849 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:51 np0005554845 podman[215153]: 2025-12-11 06:06:51.136708864 +0000 UTC m=+0.061359567 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:06:51 np0005554845 podman[215154]: 2025-12-11 06:06:51.156559893 +0000 UTC m=+0.075936232 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, release=1755695350, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, version=9.6, distribution-scope=public, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 11 01:06:51 np0005554845 nova_compute[187128]: 2025-12-11 06:06:51.429 187132 DEBUG nova.network.neutron [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Successfully created port: aee944ef-3d55-4d72-85fd-0bcba5cebad9 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:06:52 np0005554845 nova_compute[187128]: 2025-12-11 06:06:52.948 187132 DEBUG nova.network.neutron [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Successfully updated port: aee944ef-3d55-4d72-85fd-0bcba5cebad9 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:06:52 np0005554845 nova_compute[187128]: 2025-12-11 06:06:52.975 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:06:52 np0005554845 nova_compute[187128]: 2025-12-11 06:06:52.976 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquired lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:06:52 np0005554845 nova_compute[187128]: 2025-12-11 06:06:52.976 187132 DEBUG nova.network.neutron [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:06:53 np0005554845 nova_compute[187128]: 2025-12-11 06:06:53.566 187132 DEBUG nova.network.neutron [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:06:53 np0005554845 nova_compute[187128]: 2025-12-11 06:06:53.600 187132 DEBUG nova.compute.manager [req-1cfb73ac-07cd-49eb-a4ce-e856c425feaa req-a367a557-778b-4cdd-92e3-0cb3472b69a5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-changed-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:53 np0005554845 nova_compute[187128]: 2025-12-11 06:06:53.601 187132 DEBUG nova.compute.manager [req-1cfb73ac-07cd-49eb-a4ce-e856c425feaa req-a367a557-778b-4cdd-92e3-0cb3472b69a5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Refreshing instance network info cache due to event network-changed-aee944ef-3d55-4d72-85fd-0bcba5cebad9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:06:53 np0005554845 nova_compute[187128]: 2025-12-11 06:06:53.601 187132 DEBUG oslo_concurrency.lockutils [req-1cfb73ac-07cd-49eb-a4ce-e856c425feaa req-a367a557-778b-4cdd-92e3-0cb3472b69a5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:06:54 np0005554845 nova_compute[187128]: 2025-12-11 06:06:54.049 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:54 np0005554845 nova_compute[187128]: 2025-12-11 06:06:54.851 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.351 187132 DEBUG nova.network.neutron [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance_info_cache with network_info: [{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.380 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Releasing lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.380 187132 DEBUG nova.compute.manager [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Instance network_info: |[{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.381 187132 DEBUG oslo_concurrency.lockutils [req-1cfb73ac-07cd-49eb-a4ce-e856c425feaa req-a367a557-778b-4cdd-92e3-0cb3472b69a5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.381 187132 DEBUG nova.network.neutron [req-1cfb73ac-07cd-49eb-a4ce-e856c425feaa req-a367a557-778b-4cdd-92e3-0cb3472b69a5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Refreshing network info cache for port aee944ef-3d55-4d72-85fd-0bcba5cebad9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.385 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Start _get_guest_xml network_info=[{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.392 187132 WARNING nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.399 187132 DEBUG nova.virt.libvirt.host [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.400 187132 DEBUG nova.virt.libvirt.host [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.412 187132 DEBUG nova.virt.libvirt.host [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.413 187132 DEBUG nova.virt.libvirt.host [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.415 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.415 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.416 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.416 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.416 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.417 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.417 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.417 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.418 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.418 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.418 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.419 187132 DEBUG nova.virt.hardware [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.423 187132 DEBUG nova.virt.libvirt.vif [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1186378686',display_name='tempest-TestNetworkAdvancedServerOps-server-1186378686',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1186378686',id=10,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDvu269K3Wq5vyC2HL1b6A8dJTLjcpEsj7D9cVxVm9DHphj86xufSg/vW/M3Pt7tVDz6L3awCoDApZq7RNDhAAwSmH7Z/SFby/7dDKGWNDp4HLOCIl9fXp9onGKTfEF+xg==',key_name='tempest-TestNetworkAdvancedServerOps-1141582953',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-2cydsqis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:06:48Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=d29187d8-59e6-4e5a-aef7-97fef6cf24c7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.424 187132 DEBUG nova.network.os_vif_util [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.425 187132 DEBUG nova.network.os_vif_util [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.426 187132 DEBUG nova.objects.instance [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'pci_devices' on Instance uuid d29187d8-59e6-4e5a-aef7-97fef6cf24c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.443 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  <uuid>d29187d8-59e6-4e5a-aef7-97fef6cf24c7</uuid>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  <name>instance-0000000a</name>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1186378686</nova:name>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:06:55</nova:creationTime>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:        <nova:user uuid="40cb523bfe1e4484bb2e91c903500c97">tempest-TestNetworkAdvancedServerOps-369129245-project-member</nova:user>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:        <nova:project uuid="3ec4c03cd7274517b88d9087ad4cbd83">tempest-TestNetworkAdvancedServerOps-369129245</nova:project>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:        <nova:port uuid="aee944ef-3d55-4d72-85fd-0bcba5cebad9">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <entry name="serial">d29187d8-59e6-4e5a-aef7-97fef6cf24c7</entry>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <entry name="uuid">d29187d8-59e6-4e5a-aef7-97fef6cf24c7</entry>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.config"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:39:5c:9d"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <target dev="tapaee944ef-3d"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/console.log" append="off"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:06:55 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:06:55 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:06:55 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:06:55 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.444 187132 DEBUG nova.compute.manager [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Preparing to wait for external event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.445 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.445 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.445 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.446 187132 DEBUG nova.virt.libvirt.vif [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1186378686',display_name='tempest-TestNetworkAdvancedServerOps-server-1186378686',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1186378686',id=10,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDvu269K3Wq5vyC2HL1b6A8dJTLjcpEsj7D9cVxVm9DHphj86xufSg/vW/M3Pt7tVDz6L3awCoDApZq7RNDhAAwSmH7Z/SFby/7dDKGWNDp4HLOCIl9fXp9onGKTfEF+xg==',key_name='tempest-TestNetworkAdvancedServerOps-1141582953',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-2cydsqis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:06:48Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=d29187d8-59e6-4e5a-aef7-97fef6cf24c7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.446 187132 DEBUG nova.network.os_vif_util [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.447 187132 DEBUG nova.network.os_vif_util [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.447 187132 DEBUG os_vif [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.448 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.448 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.449 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.452 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.452 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaee944ef-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.452 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaee944ef-3d, col_values=(('external_ids', {'iface-id': 'aee944ef-3d55-4d72-85fd-0bcba5cebad9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:5c:9d', 'vm-uuid': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.454 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:55 np0005554845 NetworkManager[55529]: <info>  [1765433215.4556] manager: (tapaee944ef-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/37)
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.456 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.459 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.460 187132 INFO os_vif [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d')#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.529 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.530 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.530 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No VIF found with MAC fa:16:3e:39:5c:9d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:06:55 np0005554845 nova_compute[187128]: 2025-12-11 06:06:55.531 187132 INFO nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Using config drive#033[00m
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.216 187132 INFO nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Creating config drive at /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.config#033[00m
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.222 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxnjjmvue execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.352 187132 DEBUG oslo_concurrency.processutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpxnjjmvue" returned: 0 in 0.131s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:06:56 np0005554845 kernel: tapaee944ef-3d: entered promiscuous mode
Dec 11 01:06:56 np0005554845 NetworkManager[55529]: <info>  [1765433216.4227] manager: (tapaee944ef-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/38)
Dec 11 01:06:56 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:56Z|00056|binding|INFO|Claiming lport aee944ef-3d55-4d72-85fd-0bcba5cebad9 for this chassis.
Dec 11 01:06:56 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:56Z|00057|binding|INFO|aee944ef-3d55-4d72-85fd-0bcba5cebad9: Claiming fa:16:3e:39:5c:9d 10.100.0.14
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.424 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.429 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.431 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.449 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:5c:9d 10.100.0.14'], port_security=['fa:16:3e:39:5c:9d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '2', 'neutron:security_group_ids': '25d5132c-a309-410e-93c9-7759e7948f62', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=114c3962-e260-4a4f-84c2-081b45071782, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=aee944ef-3d55-4d72-85fd-0bcba5cebad9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.450 104320 INFO neutron.agent.ovn.metadata.agent [-] Port aee944ef-3d55-4d72-85fd-0bcba5cebad9 in datapath fa8f22dd-28ac-458d-9f63-a7d8a915d217 bound to our chassis#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.452 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa8f22dd-28ac-458d-9f63-a7d8a915d217#033[00m
Dec 11 01:06:56 np0005554845 systemd-udevd[215217]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:06:56 np0005554845 systemd-machined[153381]: New machine qemu-4-instance-0000000a.
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.465 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cf4e2fa4-7214-4175-b5d4-2a840d865a4d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.466 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfa8f22dd-21 in ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.468 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfa8f22dd-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.468 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[82a3e7a1-5cc9-4098-a2d2-90a742bf5df3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.468 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[75175afb-f349-4196-87be-782ae6e6967f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.478 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[b114e5aa-daa6-4005-965d-aa61de71532d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 NetworkManager[55529]: <info>  [1765433216.4810] device (tapaee944ef-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:06:56 np0005554845 NetworkManager[55529]: <info>  [1765433216.4820] device (tapaee944ef-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:06:56 np0005554845 systemd[1]: Started Virtual Machine qemu-4-instance-0000000a.
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.502 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:56 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:56Z|00058|binding|INFO|Setting lport aee944ef-3d55-4d72-85fd-0bcba5cebad9 ovn-installed in OVS
Dec 11 01:06:56 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:56Z|00059|binding|INFO|Setting lport aee944ef-3d55-4d72-85fd-0bcba5cebad9 up in Southbound
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.507 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.512 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cf9aea4a-2034-4b70-9370-573f65ea9516]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.539 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[9596f591-f723-4fcb-86da-74ec673f79f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.544 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[552e6a09-6e26-425a-8375-75b654f7c74d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 NetworkManager[55529]: <info>  [1765433216.5458] manager: (tapfa8f22dd-20): new Veth device (/org/freedesktop/NetworkManager/Devices/39)
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.576 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[4271bc3c-e9ee-461c-b8ea-81224795063e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.578 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[0c3c8739-85ff-40db-8e87-04afacfb9ea0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 NetworkManager[55529]: <info>  [1765433216.5989] device (tapfa8f22dd-20): carrier: link connected
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.602 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[b69220f2-461e-4695-b28a-7f3715d26088]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.620 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6cb9d40c-1c65-48b9-ba17-ed3f0431aa14]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa8f22dd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:f5:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349124, 'reachable_time': 36035, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215249, 'error': None, 'target': 'ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.634 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fb369286-74f6-4ee9-b565-8ff303fcd61f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe07:f5a2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 349124, 'tstamp': 349124}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215250, 'error': None, 'target': 'ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.649 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[71fc4845-e3d0-4d17-8ee8-fde8a5242c96]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa8f22dd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:f5:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 21], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349124, 'reachable_time': 36035, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215251, 'error': None, 'target': 'ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.677 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ae571097-7513-4232-aeda-fe4241a18e4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.737 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[18599d04-53f8-413b-b1b0-da301f370684]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.738 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa8f22dd-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.738 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.738 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa8f22dd-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.771 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:56 np0005554845 NetworkManager[55529]: <info>  [1765433216.7735] manager: (tapfa8f22dd-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/40)
Dec 11 01:06:56 np0005554845 kernel: tapfa8f22dd-20: entered promiscuous mode
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.776 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.777 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa8f22dd-20, col_values=(('external_ids', {'iface-id': 'c1aa7c5f-7cb5-4f8e-b844-cd400103ee8b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.779 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:56 np0005554845 ovn_controller[95428]: 2025-12-11T06:06:56Z|00060|binding|INFO|Releasing lport c1aa7c5f-7cb5-4f8e-b844-cd400103ee8b from this chassis (sb_readonly=0)
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.803 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.804 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fa8f22dd-28ac-458d-9f63-a7d8a915d217.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fa8f22dd-28ac-458d-9f63-a7d8a915d217.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.804 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.805 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fa1ef80f-700b-437b-84ce-fada2189b2bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.805 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-fa8f22dd-28ac-458d-9f63-a7d8a915d217
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/fa8f22dd-28ac-458d-9f63-a7d8a915d217.pid.haproxy
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID fa8f22dd-28ac-458d-9f63-a7d8a915d217
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:06:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:06:56.806 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'env', 'PROCESS_TAG=haproxy-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fa8f22dd-28ac-458d-9f63-a7d8a915d217.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.985 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433216.9851246, d29187d8-59e6-4e5a-aef7-97fef6cf24c7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:06:56 np0005554845 nova_compute[187128]: 2025-12-11 06:06:56.986 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] VM Started (Lifecycle Event)#033[00m
Dec 11 01:06:57 np0005554845 nova_compute[187128]: 2025-12-11 06:06:57.015 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:06:57 np0005554845 nova_compute[187128]: 2025-12-11 06:06:57.020 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433216.9854412, d29187d8-59e6-4e5a-aef7-97fef6cf24c7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:06:57 np0005554845 nova_compute[187128]: 2025-12-11 06:06:57.021 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:06:57 np0005554845 nova_compute[187128]: 2025-12-11 06:06:57.045 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:06:57 np0005554845 nova_compute[187128]: 2025-12-11 06:06:57.048 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:06:57 np0005554845 nova_compute[187128]: 2025-12-11 06:06:57.071 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:06:57 np0005554845 podman[215290]: 2025-12-11 06:06:57.201706682 +0000 UTC m=+0.057393669 container create 19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0)
Dec 11 01:06:57 np0005554845 systemd[1]: Started libpod-conmon-19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4.scope.
Dec 11 01:06:57 np0005554845 podman[215290]: 2025-12-11 06:06:57.169775395 +0000 UTC m=+0.025462372 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:06:57 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:06:57 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/d0a0c4dccc1e1d13e6fe6ee229bfb549d2a5b8cedb6537a24f030d6c3608ac6a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:06:57 np0005554845 podman[215290]: 2025-12-11 06:06:57.290236495 +0000 UTC m=+0.145923532 container init 19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:06:57 np0005554845 podman[215290]: 2025-12-11 06:06:57.298944491 +0000 UTC m=+0.154631478 container start 19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 01:06:57 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[215306]: [NOTICE]   (215310) : New worker (215312) forked
Dec 11 01:06:57 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[215306]: [NOTICE]   (215310) : Loading success.
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.193 187132 DEBUG nova.network.neutron [req-1cfb73ac-07cd-49eb-a4ce-e856c425feaa req-a367a557-778b-4cdd-92e3-0cb3472b69a5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updated VIF entry in instance network info cache for port aee944ef-3d55-4d72-85fd-0bcba5cebad9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.194 187132 DEBUG nova.network.neutron [req-1cfb73ac-07cd-49eb-a4ce-e856c425feaa req-a367a557-778b-4cdd-92e3-0cb3472b69a5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance_info_cache with network_info: [{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.213 187132 DEBUG oslo_concurrency.lockutils [req-1cfb73ac-07cd-49eb-a4ce-e856c425feaa req-a367a557-778b-4cdd-92e3-0cb3472b69a5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.612 187132 DEBUG nova.compute.manager [req-5e3e66f8-3ce3-4248-92f0-1b05d053b8ec req-c408108b-fe5f-4236-806b-47aaf9c70869 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.612 187132 DEBUG oslo_concurrency.lockutils [req-5e3e66f8-3ce3-4248-92f0-1b05d053b8ec req-c408108b-fe5f-4236-806b-47aaf9c70869 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.613 187132 DEBUG oslo_concurrency.lockutils [req-5e3e66f8-3ce3-4248-92f0-1b05d053b8ec req-c408108b-fe5f-4236-806b-47aaf9c70869 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.613 187132 DEBUG oslo_concurrency.lockutils [req-5e3e66f8-3ce3-4248-92f0-1b05d053b8ec req-c408108b-fe5f-4236-806b-47aaf9c70869 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.613 187132 DEBUG nova.compute.manager [req-5e3e66f8-3ce3-4248-92f0-1b05d053b8ec req-c408108b-fe5f-4236-806b-47aaf9c70869 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Processing event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.614 187132 DEBUG nova.compute.manager [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.619 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433219.6187634, d29187d8-59e6-4e5a-aef7-97fef6cf24c7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.619 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.621 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.625 187132 INFO nova.virt.libvirt.driver [-] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Instance spawned successfully.#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.626 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.639 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.646 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.649 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.650 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.650 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.651 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.651 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.651 187132 DEBUG nova.virt.libvirt.driver [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.724 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.765 187132 INFO nova.compute.manager [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Took 10.87 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.765 187132 DEBUG nova.compute.manager [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.853 187132 INFO nova.compute.manager [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Took 11.73 seconds to build instance.#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.855 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:06:59 np0005554845 nova_compute[187128]: 2025-12-11 06:06:59.872 187132 DEBUG oslo_concurrency.lockutils [None req-b7ec0007-933c-4fc4-803e-97d7512d4394 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.882s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:00 np0005554845 nova_compute[187128]: 2025-12-11 06:07:00.455 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:03 np0005554845 nova_compute[187128]: 2025-12-11 06:07:03.704 187132 DEBUG nova.compute.manager [req-8909751b-3144-4843-b2cb-8a286bd6508e req-45869ded-7046-420b-b646-2744ea69669a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:07:03 np0005554845 nova_compute[187128]: 2025-12-11 06:07:03.705 187132 DEBUG oslo_concurrency.lockutils [req-8909751b-3144-4843-b2cb-8a286bd6508e req-45869ded-7046-420b-b646-2744ea69669a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:03 np0005554845 nova_compute[187128]: 2025-12-11 06:07:03.705 187132 DEBUG oslo_concurrency.lockutils [req-8909751b-3144-4843-b2cb-8a286bd6508e req-45869ded-7046-420b-b646-2744ea69669a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:03 np0005554845 nova_compute[187128]: 2025-12-11 06:07:03.705 187132 DEBUG oslo_concurrency.lockutils [req-8909751b-3144-4843-b2cb-8a286bd6508e req-45869ded-7046-420b-b646-2744ea69669a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:03 np0005554845 nova_compute[187128]: 2025-12-11 06:07:03.705 187132 DEBUG nova.compute.manager [req-8909751b-3144-4843-b2cb-8a286bd6508e req-45869ded-7046-420b-b646-2744ea69669a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] No waiting events found dispatching network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:07:03 np0005554845 nova_compute[187128]: 2025-12-11 06:07:03.706 187132 WARNING nova.compute.manager [req-8909751b-3144-4843-b2cb-8a286bd6508e req-45869ded-7046-420b-b646-2744ea69669a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received unexpected event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 for instance with vm_state active and task_state None.#033[00m
Dec 11 01:07:04 np0005554845 nova_compute[187128]: 2025-12-11 06:07:04.855 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:05 np0005554845 nova_compute[187128]: 2025-12-11 06:07:05.458 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:06 np0005554845 podman[215321]: 2025-12-11 06:07:06.198756153 +0000 UTC m=+0.116905174 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:07:09 np0005554845 nova_compute[187128]: 2025-12-11 06:07:09.857 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:10 np0005554845 nova_compute[187128]: 2025-12-11 06:07:10.461 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:11 np0005554845 nova_compute[187128]: 2025-12-11 06:07:11.798 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:11 np0005554845 NetworkManager[55529]: <info>  [1765433231.7991] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/41)
Dec 11 01:07:11 np0005554845 NetworkManager[55529]: <info>  [1765433231.8018] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/42)
Dec 11 01:07:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:12Z|00061|binding|INFO|Releasing lport c1aa7c5f-7cb5-4f8e-b844-cd400103ee8b from this chassis (sb_readonly=0)
Dec 11 01:07:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:12Z|00062|binding|INFO|Releasing lport c1aa7c5f-7cb5-4f8e-b844-cd400103ee8b from this chassis (sb_readonly=0)
Dec 11 01:07:12 np0005554845 nova_compute[187128]: 2025-12-11 06:07:12.067 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:12 np0005554845 podman[215368]: 2025-12-11 06:07:12.191754297 +0000 UTC m=+0.092746130 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:07:12 np0005554845 nova_compute[187128]: 2025-12-11 06:07:12.398 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:12Z|00010|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:39:5c:9d 10.100.0.14
Dec 11 01:07:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:12Z|00011|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:5c:9d 10.100.0.14
Dec 11 01:07:14 np0005554845 podman[215391]: 2025-12-11 06:07:14.181153661 +0000 UTC m=+0.108154062 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:07:14 np0005554845 podman[215392]: 2025-12-11 06:07:14.192296913 +0000 UTC m=+0.115355037 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller)
Dec 11 01:07:14 np0005554845 nova_compute[187128]: 2025-12-11 06:07:14.267 187132 DEBUG nova.compute.manager [req-1c93168b-fec0-47f7-b869-e2a83362f44f req-56d453d2-29cd-4457-8f36-95f8a68fc00b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-changed-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:07:14 np0005554845 nova_compute[187128]: 2025-12-11 06:07:14.267 187132 DEBUG nova.compute.manager [req-1c93168b-fec0-47f7-b869-e2a83362f44f req-56d453d2-29cd-4457-8f36-95f8a68fc00b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Refreshing instance network info cache due to event network-changed-aee944ef-3d55-4d72-85fd-0bcba5cebad9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:07:14 np0005554845 nova_compute[187128]: 2025-12-11 06:07:14.268 187132 DEBUG oslo_concurrency.lockutils [req-1c93168b-fec0-47f7-b869-e2a83362f44f req-56d453d2-29cd-4457-8f36-95f8a68fc00b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:07:14 np0005554845 nova_compute[187128]: 2025-12-11 06:07:14.268 187132 DEBUG oslo_concurrency.lockutils [req-1c93168b-fec0-47f7-b869-e2a83362f44f req-56d453d2-29cd-4457-8f36-95f8a68fc00b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:07:14 np0005554845 nova_compute[187128]: 2025-12-11 06:07:14.268 187132 DEBUG nova.network.neutron [req-1c93168b-fec0-47f7-b869-e2a83362f44f req-56d453d2-29cd-4457-8f36-95f8a68fc00b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Refreshing network info cache for port aee944ef-3d55-4d72-85fd-0bcba5cebad9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:07:14 np0005554845 nova_compute[187128]: 2025-12-11 06:07:14.858 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:15 np0005554845 nova_compute[187128]: 2025-12-11 06:07:15.463 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:17 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:17Z|00063|binding|INFO|Releasing lport c1aa7c5f-7cb5-4f8e-b844-cd400103ee8b from this chassis (sb_readonly=0)
Dec 11 01:07:17 np0005554845 nova_compute[187128]: 2025-12-11 06:07:17.144 187132 DEBUG nova.network.neutron [req-1c93168b-fec0-47f7-b869-e2a83362f44f req-56d453d2-29cd-4457-8f36-95f8a68fc00b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updated VIF entry in instance network info cache for port aee944ef-3d55-4d72-85fd-0bcba5cebad9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:07:17 np0005554845 nova_compute[187128]: 2025-12-11 06:07:17.145 187132 DEBUG nova.network.neutron [req-1c93168b-fec0-47f7-b869-e2a83362f44f req-56d453d2-29cd-4457-8f36-95f8a68fc00b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance_info_cache with network_info: [{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:07:17 np0005554845 podman[215430]: 2025-12-11 06:07:17.163167978 +0000 UTC m=+0.084367797 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 01:07:17 np0005554845 nova_compute[187128]: 2025-12-11 06:07:17.165 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:17 np0005554845 nova_compute[187128]: 2025-12-11 06:07:17.168 187132 DEBUG oslo_concurrency.lockutils [req-1c93168b-fec0-47f7-b869-e2a83362f44f req-56d453d2-29cd-4457-8f36-95f8a68fc00b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:07:18 np0005554845 nova_compute[187128]: 2025-12-11 06:07:18.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:07:18 np0005554845 nova_compute[187128]: 2025-12-11 06:07:18.724 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:18 np0005554845 nova_compute[187128]: 2025-12-11 06:07:18.724 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:18 np0005554845 nova_compute[187128]: 2025-12-11 06:07:18.725 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:18 np0005554845 nova_compute[187128]: 2025-12-11 06:07:18.725 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:07:18 np0005554845 nova_compute[187128]: 2025-12-11 06:07:18.805 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:18 np0005554845 nova_compute[187128]: 2025-12-11 06:07:18.873 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:18 np0005554845 nova_compute[187128]: 2025-12-11 06:07:18.874 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:18 np0005554845 nova_compute[187128]: 2025-12-11 06:07:18.955 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json" returned: 0 in 0.080s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.061 187132 INFO nova.compute.manager [None req-4acd5b10-6a89-4442-b91f-82cde9659b5d 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Get console output#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.065 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.117 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.117 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5574MB free_disk=73.30229568481445GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.118 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.118 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.206 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance d29187d8-59e6-4e5a-aef7-97fef6cf24c7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.206 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.206 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.249 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.262 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.280 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.281 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.162s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:19.395 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=6, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=5) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.397 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:19.399 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.717 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:19 np0005554845 nova_compute[187128]: 2025-12-11 06:07:19.861 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:20 np0005554845 nova_compute[187128]: 2025-12-11 06:07:20.466 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:21 np0005554845 nova_compute[187128]: 2025-12-11 06:07:21.691 187132 INFO nova.compute.manager [None req-e1d5ba46-be80-42e3-b1e8-61e42baf841a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Get console output#033[00m
Dec 11 01:07:21 np0005554845 nova_compute[187128]: 2025-12-11 06:07:21.696 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:07:21 np0005554845 podman[215455]: 2025-12-11 06:07:21.768345771 +0000 UTC m=+0.050928161 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:07:21 np0005554845 podman[215456]: 2025-12-11 06:07:21.804426369 +0000 UTC m=+0.073309507 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, vcs-type=git, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.buildah.version=1.33.7, release=1755695350)
Dec 11 01:07:22 np0005554845 nova_compute[187128]: 2025-12-11 06:07:22.281 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:07:22 np0005554845 nova_compute[187128]: 2025-12-11 06:07:22.649 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:22 np0005554845 nova_compute[187128]: 2025-12-11 06:07:22.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:07:22 np0005554845 nova_compute[187128]: 2025-12-11 06:07:22.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:07:22 np0005554845 nova_compute[187128]: 2025-12-11 06:07:22.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:07:23 np0005554845 nova_compute[187128]: 2025-12-11 06:07:23.058 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:07:23 np0005554845 nova_compute[187128]: 2025-12-11 06:07:23.058 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:07:23 np0005554845 nova_compute[187128]: 2025-12-11 06:07:23.059 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 01:07:23 np0005554845 nova_compute[187128]: 2025-12-11 06:07:23.059 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid d29187d8-59e6-4e5a-aef7-97fef6cf24c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:07:24 np0005554845 nova_compute[187128]: 2025-12-11 06:07:24.863 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:25 np0005554845 nova_compute[187128]: 2025-12-11 06:07:25.468 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:26.219 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:26.220 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:26.221 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:27.403 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '6'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:29 np0005554845 nova_compute[187128]: 2025-12-11 06:07:29.864 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.099 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000000a', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'hostId': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.100 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.102 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for d29187d8-59e6-4e5a-aef7-97fef6cf24c7 / tapaee944ef-3d inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.102 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/network.incoming.packets volume: 38 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8e125c48-a18d-4c3f-a08f-86cb2340318c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 38, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-0000000a-d29187d8-59e6-4e5a-aef7-97fef6cf24c7-tapaee944ef-3d', 'timestamp': '2025-12-11T06:07:30.100106', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'tapaee944ef-3d', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:5c:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaee944ef-3d'}, 'message_id': 'acf99c10-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.801774957, 'message_signature': 'c9d045cc96e00708c0474d658a9209b5bb8fac9aebba43de2129f3cf7e7f90bf'}]}, 'timestamp': '2025-12-11 06:07:30.102546', '_unique_id': '3ef836141bdd4a0690ba1f3d80d74b73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.103 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cb49bd42-794e-431e-977d-1676b3be1190', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-0000000a-d29187d8-59e6-4e5a-aef7-97fef6cf24c7-tapaee944ef-3d', 'timestamp': '2025-12-11T06:07:30.104185', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'tapaee944ef-3d', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:5c:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaee944ef-3d'}, 'message_id': 'acf9e76a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.801774957, 'message_signature': '1333f23a042e6f0a87b3816de67c1f9e29a8ca0ce65091e99aba9787a2587b88'}]}, 'timestamp': '2025-12-11 06:07:30.104485', '_unique_id': '3657cae4b2624682a4b47c7b743a8ccc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.104 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.116 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.usage volume: 30015488 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.117 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '574a4f54-2675-4865-b4a4-4927498a495b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30015488, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-vda', 'timestamp': '2025-12-11T06:07:30.105740', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'acfbd1a6-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.80741207, 'message_signature': '73bc1087376c240226e0cf06f84ba5838353ef0a1c201ad366f695df3eeebb7b'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-sda', 'timestamp': '2025-12-11T06:07:30.105740', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'acfbdc1e-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.80741207, 'message_signature': 'a92b98b661349283a4b89b8f23998cfbde63d79080f00ed45fa2a1a0e9693f28'}]}, 'timestamp': '2025-12-11 06:07:30.117229', '_unique_id': '06cbf31dee1749f98fb0a82159f0e840'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.118 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '508ebef3-0b98-4a04-ac4b-78953550ba84', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-0000000a-d29187d8-59e6-4e5a-aef7-97fef6cf24c7-tapaee944ef-3d', 'timestamp': '2025-12-11T06:07:30.118790', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'tapaee944ef-3d', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:5c:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaee944ef-3d'}, 'message_id': 'acfc2232-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.801774957, 'message_signature': '5cfe14ffaa1446b70466a21f925dc8ad27a0a5bbbba2a6202c51b3af891e1366'}]}, 'timestamp': '2025-12-11 06:07:30.119028', '_unique_id': '6f97bcd05e4d43169ea005beabb4592b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.119 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.120 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.139 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/cpu volume: 11440000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97aa8f2c-969f-4447-bcd7-d99960d8f41f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11440000000, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'timestamp': '2025-12-11T06:07:30.120158', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'acff5c4a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.841233346, 'message_signature': '72c7040b71230c5e3d898671b452803443f569a45896a3a9eedcf02db838ae46'}]}, 'timestamp': '2025-12-11 06:07:30.140249', '_unique_id': '1384774de1f44844bd95854e58c9e895'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.141 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '49cbce49-5e2b-4820-8694-33ccdf9294c2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-0000000a-d29187d8-59e6-4e5a-aef7-97fef6cf24c7-tapaee944ef-3d', 'timestamp': '2025-12-11T06:07:30.142102', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'tapaee944ef-3d', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:5c:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaee944ef-3d'}, 'message_id': 'acffb0fa-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.801774957, 'message_signature': '10262d0f4b9e89f3059a4ea7dd035805871ffc3d9fa6023cf7dfa30633465d62'}]}, 'timestamp': '2025-12-11 06:07:30.142348', '_unique_id': '502878ec930c4a928670a8a81ca79ff5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.142 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.143 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '068c96d7-3b7d-4b63-910b-660fac45b50f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-0000000a-d29187d8-59e6-4e5a-aef7-97fef6cf24c7-tapaee944ef-3d', 'timestamp': '2025-12-11T06:07:30.143499', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'tapaee944ef-3d', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:5c:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaee944ef-3d'}, 'message_id': 'acffe70a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.801774957, 'message_signature': 'e0c2b2f03c8f57e6a31794d484b229f3b381965aa666e10ea692dd62ecca64b2'}]}, 'timestamp': '2025-12-11 06:07:30.143728', '_unique_id': 'b672625209244c719097afaf7976e222'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.144 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1186378686>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1186378686>]
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.145 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.145 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1186378686>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1186378686>]
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.145 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.145 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.145 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1186378686>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1186378686>]
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.145 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.145 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.145 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1186378686>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1186378686>]
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.145 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.146 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.146 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d8205d4-bf56-4576-a2b0-28e3e3a7fd47', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-vda', 'timestamp': '2025-12-11T06:07:30.145986', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad004970-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.80741207, 'message_signature': '3767eae84ee569c114cc8b87ca984646c98230cf6bec144ed29ec65f141661de'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-sda', 'timestamp': '2025-12-11T06:07:30.145986', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad00544c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.80741207, 'message_signature': 'af852de3c14e7689e838db1c4cd306d3be46533d338f1f8aae068978bb574c10'}]}, 'timestamp': '2025-12-11 06:07:30.146549', '_unique_id': '79924f4e2a9b4614a2e0e05b87cc35ae'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.147 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.172 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.write.bytes volume: 72962048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.173 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e313ad17-6107-47e1-a6e8-ce910e47493a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72962048, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-vda', 'timestamp': '2025-12-11T06:07:30.147724', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad046dac-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': 'b6d3d6d66df6e34fbcd94ae239e2a6d1ba7d1c875b9d77381a91c5cee9b59a4d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-sda', 'timestamp': '2025-12-11T06:07:30.147724', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad047cde-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': 'b252d18f7eccc6941b7c23ad7dac2df92991df51015ca3734ad9ed7a29def891'}]}, 'timestamp': '2025-12-11 06:07:30.173818', '_unique_id': '561ada8e076745748bc7b4282e63ae53'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.175 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.176 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.176 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/network.incoming.bytes volume: 6796 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '276174a5-9c53-4192-b8f8-a43867038c97', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 6796, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-0000000a-d29187d8-59e6-4e5a-aef7-97fef6cf24c7-tapaee944ef-3d', 'timestamp': '2025-12-11T06:07:30.176105', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'tapaee944ef-3d', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:5c:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaee944ef-3d'}, 'message_id': 'ad04e322-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.801774957, 'message_signature': '04cf04949823c0c51887cdef6f433991dbf2bac5dc6df32bbc4033026b4ee122'}]}, 'timestamp': '2025-12-11 06:07:30.176480', '_unique_id': '52ab9a377c304f86b5515cd677446fa0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.177 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.178 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/memory.usage volume: 42.84375 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '55be2cab-5572-4a81-9c7e-139dbe37de33', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 42.84375, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'timestamp': '2025-12-11T06:07:30.178071', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'ad05305c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.841233346, 'message_signature': '0e6ef19452232396a07a2ba38d35fd3612e3b19bf4ec3867e52339e66a402adf'}]}, 'timestamp': '2025-12-11 06:07:30.178495', '_unique_id': '8073350d7ee647478b2b5ade0c9120e8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.179 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.180 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.180 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.read.requests volume: 1104 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.180 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ca1673d4-dd9c-48f2-a0c8-fba9f314aad5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1104, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-vda', 'timestamp': '2025-12-11T06:07:30.180127', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad057fb2-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': '7fe099d82226dfb6ad3a09c0b6d6bd8a1921f3a828bdfa81e5049c55bc3f4af3'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 
'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-sda', 'timestamp': '2025-12-11T06:07:30.180127', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad058c28-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': 'ac9d9e24eb8ca38bccb998e396fee2d819af70cae00d5bf98ef6a17aad343b87'}]}, 'timestamp': '2025-12-11 06:07:30.180751', '_unique_id': '8cab316e6a5a41a9a8a2d69f25ea5c3c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.181 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.182 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.182 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.read.bytes volume: 30624256 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.183 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8625e9c4-75e8-4be3-901e-21329e102dca', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30624256, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-vda', 'timestamp': '2025-12-11T06:07:30.182672', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad05e466-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': '510043c88e74ae4d1dc2ef5296c82e87755b6d6a78c5a3d50ebf01998eade40a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': 
None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-sda', 'timestamp': '2025-12-11T06:07:30.182672', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad05f30c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': 'e3ac0b9d37c66d259b9d56b8a2cddaf80a1cb8cf28653a40c1cba66c43364459'}]}, 'timestamp': '2025-12-11 06:07:30.183479', '_unique_id': 'c417f6db27bb47dca56a14c133ae0c43'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.184 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.185 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.185 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.write.requests volume: 299 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.185 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0b350532-778c-4416-ac69-84689c427e3b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 299, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-vda', 'timestamp': '2025-12-11T06:07:30.185157', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad06453c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': '7e97c64b530d9bf43dc35512e74c0eb766d4de5abaccb2fa69f1bd330efc16f7'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-sda', 'timestamp': '2025-12-11T06:07:30.185157', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad0653a6-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': 'a8c0e76b378c72d6dce8122b2a2c86d4547e38d93f69b1d4dd5e6ad3b4008ec0'}]}, 'timestamp': '2025-12-11 06:07:30.185864', '_unique_id': 'da2c113fb3074989ab9629a6ad98f3cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.186 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.187 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.187 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/network.outgoing.packets volume: 41 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7a0c8fca-3e80-4a55-86a7-06029b5f1da1', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 41, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-0000000a-d29187d8-59e6-4e5a-aef7-97fef6cf24c7-tapaee944ef-3d', 'timestamp': '2025-12-11T06:07:30.187596', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'tapaee944ef-3d', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:5c:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaee944ef-3d'}, 'message_id': 'ad06a388-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.801774957, 'message_signature': '8c9070b4cbebfee7a79f56b663ade800206b82f7432eecbf19554a9d93bfd99f'}]}, 'timestamp': '2025-12-11 06:07:30.187925', '_unique_id': '542089887c2a4d45a25fe73692d9356e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.188 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.189 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.write.latency volume: 6161034711 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.189 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '124b6ba5-c753-49fd-ab05-753b9c6c9fa0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 6161034711, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-vda', 'timestamp': '2025-12-11T06:07:30.189553', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad06efc8-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': '406f4f7b9cc53aaa9af9d2e4511a454cadbdd1c6ee6e36309ab7c734fde86e80'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-sda', 'timestamp': '2025-12-11T06:07:30.189553', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad06fb4e-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': 'b9096ee928f9a4b8a6096fe4c910fb900206fa317566ab31164f61c372276455'}]}, 'timestamp': '2025-12-11 06:07:30.190160', '_unique_id': '2105a631ef7d4beba5efac54f064bf9a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.190 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.191 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.191 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.192 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '443fd9fc-8df1-4da8-b565-0044461c539c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-vda', 'timestamp': '2025-12-11T06:07:30.191816', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad074888-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.80741207, 'message_signature': 'f1794621d6aeed899673c3a8cd24217927ed40e4778361608d01413fbdc84a37'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-sda', 'timestamp': '2025-12-11T06:07:30.191816', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad075404-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.80741207, 'message_signature': 'f5b5d84eccebbac0836b9c438906bd7ee095843d584c3c48463b874cc5c16587'}]}, 'timestamp': '2025-12-11 06:07:30.192449', '_unique_id': '4b96324232dd4b1d94c6b10e16eb65c2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.193 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.194 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/network.outgoing.bytes volume: 5314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '21628789-e879-4bb9-b0c6-d5e4edf6ce6b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5314, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-0000000a-d29187d8-59e6-4e5a-aef7-97fef6cf24c7-tapaee944ef-3d', 'timestamp': '2025-12-11T06:07:30.194159', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'tapaee944ef-3d', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:5c:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaee944ef-3d'}, 'message_id': 'ad07a3dc-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.801774957, 'message_signature': '4355a193026cf09626682530fe2292a1f3bcdcf2be314c1c132bad3990bb2747'}]}, 'timestamp': '2025-12-11 06:07:30.194509', '_unique_id': '1edb3a3b8d374a469a02917e42b61d9d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.196 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ebf5a6b-6994-4d48-9e60-1ee0a47e39af', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-0000000a-d29187d8-59e6-4e5a-aef7-97fef6cf24c7-tapaee944ef-3d', 'timestamp': '2025-12-11T06:07:30.196281', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'tapaee944ef-3d', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:5c:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaee944ef-3d'}, 'message_id': 'ad07fd6e-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.801774957, 'message_signature': '77d487f9774024c3e51c4b7ab488b3de92511b4af4dfcd7863b1f48d06f184ca'}]}, 'timestamp': '2025-12-11 06:07:30.196777', '_unique_id': 'd2530ff33dfc4fb78b6fd9fe80002bcb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.197 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.198 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.198 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.read.latency volume: 204010901 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.198 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.device.read.latency volume: 29545574 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '14da4c98-3115-4fd1-86d0-e4d7b0b3e544', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 204010901, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-vda', 'timestamp': '2025-12-11T06:07:30.198476', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'ad084c4c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': '5f3e210bb678ba9d725c4493f6359943b121ab72c5e9dbb48abee0e184059176'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 29545574, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7-sda', 'timestamp': '2025-12-11T06:07:30.198476', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'instance-0000000a', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'ad08573c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.849407338, 'message_signature': 'f90ac77ee5aeb5a1a83e8e69e7d1b7cdb062d93d5fef971ffd15be11cbccc5d5'}]}, 'timestamp': '2025-12-11 06:07:30.199054', '_unique_id': 'b80a41f44a9c45b8ade219a36024b07f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.199 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.200 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.200 12 DEBUG ceilometer.compute.pollsters [-] d29187d8-59e6-4e5a-aef7-97fef6cf24c7/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c4d34e0e-d4e9-4c4e-9887-427c06824069', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-0000000a-d29187d8-59e6-4e5a-aef7-97fef6cf24c7-tapaee944ef-3d', 'timestamp': '2025-12-11T06:07:30.200940', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1186378686', 'name': 'tapaee944ef-3d', 'instance_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:39:5c:9d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapaee944ef-3d'}, 'message_id': 'ad08acaa-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3524.801774957, 'message_signature': '03d61718211b9181010f2cb73cade340d799b857fb03fd96b1e932dbae397a75'}]}, 'timestamp': '2025-12-11 06:07:30.201260', '_unique_id': '76f554fd50a145daadbba6a6e24bfad5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:07:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:07:30.201 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:07:30 np0005554845 nova_compute[187128]: 2025-12-11 06:07:30.470 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:32 np0005554845 nova_compute[187128]: 2025-12-11 06:07:32.292 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:32 np0005554845 nova_compute[187128]: 2025-12-11 06:07:32.442 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance_info_cache with network_info: [{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:07:32 np0005554845 nova_compute[187128]: 2025-12-11 06:07:32.461 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:07:32 np0005554845 nova_compute[187128]: 2025-12-11 06:07:32.461 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:07:32 np0005554845 nova_compute[187128]: 2025-12-11 06:07:32.461 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:07:32 np0005554845 nova_compute[187128]: 2025-12-11 06:07:32.462 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:07:32 np0005554845 nova_compute[187128]: 2025-12-11 06:07:32.462 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:07:32 np0005554845 nova_compute[187128]: 2025-12-11 06:07:32.462 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:07:32 np0005554845 nova_compute[187128]: 2025-12-11 06:07:32.463 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:07:32 np0005554845 nova_compute[187128]: 2025-12-11 06:07:32.463 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:07:33 np0005554845 nova_compute[187128]: 2025-12-11 06:07:33.204 187132 DEBUG oslo_concurrency.lockutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Acquiring lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:07:33 np0005554845 nova_compute[187128]: 2025-12-11 06:07:33.204 187132 DEBUG oslo_concurrency.lockutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Acquired lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:07:33 np0005554845 nova_compute[187128]: 2025-12-11 06:07:33.205 187132 DEBUG nova.network.neutron [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:07:34 np0005554845 nova_compute[187128]: 2025-12-11 06:07:34.459 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:07:34 np0005554845 nova_compute[187128]: 2025-12-11 06:07:34.865 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:35 np0005554845 nova_compute[187128]: 2025-12-11 06:07:35.473 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:35 np0005554845 nova_compute[187128]: 2025-12-11 06:07:35.489 187132 DEBUG nova.virt.libvirt.driver [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Creating tmpfile /var/lib/nova/instances/tmp4zbj72ew to notify to other compute nodes that they should mount the same storage. _create_shared_storage_test_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10041#033[00m
Dec 11 01:07:35 np0005554845 nova_compute[187128]: 2025-12-11 06:07:35.709 187132 DEBUG nova.compute.manager [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] destination check data is LibvirtLiveMigrateData(bdms=<?>,block_migration=<?>,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4zbj72ew',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path=<?>,is_shared_block_storage=<?>,is_shared_instance_path=<?>,is_volume_backed=<?>,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) check_can_live_migrate_destination /usr/lib/python3.9/site-packages/nova/compute/manager.py:8476#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.005 187132 DEBUG nova.network.neutron [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance_info_cache with network_info: [{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.033 187132 DEBUG oslo_concurrency.lockutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Releasing lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:07:37 np0005554845 podman[215500]: 2025-12-11 06:07:37.143348853 +0000 UTC m=+0.072001632 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.150 187132 DEBUG nova.virt.libvirt.driver [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Starting migrate_disk_and_power_off migrate_disk_and_power_off /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11511#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.150 187132 DEBUG nova.virt.libvirt.volume.remotefs [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Creating file /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/fc345a514a17465eb32a0854a3bb16e9.tmp on remote host 192.168.122.101 create_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:79#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.150 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/fc345a514a17465eb32a0854a3bb16e9.tmp execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.271 187132 DEBUG nova.compute.manager [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] pre_live_migration data is LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4zbj72ew',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e259711f-fca8-4dd1-9fd0-b49e0404776f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids=<?>,serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8604#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.293 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Acquiring lock "refresh_cache-e259711f-fca8-4dd1-9fd0-b49e0404776f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.294 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Acquired lock "refresh_cache-e259711f-fca8-4dd1-9fd0-b49e0404776f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.294 187132 DEBUG nova.network.neutron [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.788 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/fc345a514a17465eb32a0854a3bb16e9.tmp" returned: 1 in 0.638s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.789 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] 'ssh -o BatchMode=yes 192.168.122.101 touch /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/fc345a514a17465eb32a0854a3bb16e9.tmp' failed. Not Retrying. execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:473#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.790 187132 DEBUG nova.virt.libvirt.volume.remotefs [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Creating directory /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7 on remote host 192.168.122.101 create_dir /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:91#033[00m
Dec 11 01:07:37 np0005554845 nova_compute[187128]: 2025-12-11 06:07:37.790 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:38 np0005554845 nova_compute[187128]: 2025-12-11 06:07:38.011 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "ssh -o BatchMode=yes 192.168.122.101 mkdir -p /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7" returned: 0 in 0.221s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:38 np0005554845 nova_compute[187128]: 2025-12-11 06:07:38.016 187132 DEBUG nova.virt.libvirt.driver [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.375 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.376 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.397 187132 DEBUG nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.469 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.469 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.476 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.476 187132 INFO nova.compute.claims [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Claim successful on node compute-2.ctlplane.example.com
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.625 187132 DEBUG nova.compute.provider_tree [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.640 187132 DEBUG nova.scheduler.client.report [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.660 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.661 187132 DEBUG nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.727 187132 DEBUG nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.728 187132 DEBUG nova.network.neutron [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.751 187132 INFO nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.771 187132 DEBUG nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.865 187132 DEBUG nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.866 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.866 187132 INFO nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Creating image(s)
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.866 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "/var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.867 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.867 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.879 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.881 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.938 187132 DEBUG nova.network.neutron [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Updating instance_info_cache with network_info: [{"id": "c70283ea-f020-4b95-96ff-d6995a36ba20", "address": "fa:16:3e:8d:41:0e", "network": {"id": "88c1a45b-56e2-4aa6-a974-6011ef55c52b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-210095557-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7936cace634747e4997212d1e4422555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc70283ea-f0", "ovs_interfaceid": "c70283ea-f020-4b95-96ff-d6995a36ba20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.946 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.947 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.948 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.959 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.980 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Releasing lock "refresh_cache-e259711f-fca8-4dd1-9fd0-b49e0404776f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.982 187132 DEBUG nova.virt.libvirt.driver [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] migrate_data in pre_live_migration: LibvirtLiveMigrateData(bdms=<?>,block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4zbj72ew',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e259711f-fca8-4dd1-9fd0-b49e0404776f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=<?>,src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=<?>,target_connect_addr=<?>,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10827
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.983 187132 DEBUG nova.virt.libvirt.driver [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Creating instance directory: /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10840
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.983 187132 DEBUG nova.virt.libvirt.driver [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Creating disk.info with the contents: {'/var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk': 'qcow2', '/var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk.config': 'raw'} pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10854
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.984 187132 DEBUG nova.virt.libvirt.driver [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Checking to make sure images and backing files are present before live migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10864
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.984 187132 DEBUG nova.objects.instance [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Lazy-loading 'trusted_certs' on Instance uuid e259711f-fca8-4dd1-9fd0-b49e0404776f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 01:07:39 np0005554845 nova_compute[187128]: 2025-12-11 06:07:39.998 187132 DEBUG nova.policy [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.023 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.024 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.044 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.069 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk 1073741824" returned: 0 in 0.045s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.070 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.071 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.130 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.131 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.132 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.143 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.159 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.160 187132 DEBUG nova.virt.disk.api [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Checking if we can resize image /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.161 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.199 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.200 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.230 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.231 187132 DEBUG nova.virt.disk.api [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Cannot resize image /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.231 187132 DEBUG nova.objects.instance [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 01:07:40 np0005554845 kernel: tapaee944ef-3d (unregistering): left promiscuous mode
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.244 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.246 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Ensure instance console log exists: /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 11 01:07:40 np0005554845 NetworkManager[55529]: <info>  [1765433260.2468] device (tapaee944ef-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.247 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.247 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.248 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:40 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:40Z|00064|binding|INFO|Releasing lport aee944ef-3d55-4d72-85fd-0bcba5cebad9 from this chassis (sb_readonly=0)
Dec 11 01:07:40 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:40Z|00065|binding|INFO|Setting lport aee944ef-3d55-4d72-85fd-0bcba5cebad9 down in Southbound
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.254 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:07:40 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:40Z|00066|binding|INFO|Removing iface tapaee944ef-3d ovn-installed in OVS
Dec 11 01:07:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:40.260 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:5c:9d 10.100.0.14'], port_security=['fa:16:3e:39:5c:9d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '4', 'neutron:security_group_ids': '25d5132c-a309-410e-93c9-7759e7948f62', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.219'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=114c3962-e260-4a4f-84c2-081b45071782, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=aee944ef-3d55-4d72-85fd-0bcba5cebad9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 11 01:07:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:40.261 104320 INFO neutron.agent.ovn.metadata.agent [-] Port aee944ef-3d55-4d72-85fd-0bcba5cebad9 in datapath fa8f22dd-28ac-458d-9f63-a7d8a915d217 unbound from our chassis
Dec 11 01:07:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:40.264 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fa8f22dd-28ac-458d-9f63-a7d8a915d217, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.264 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk 1073741824" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.265 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.133s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.265 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:07:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:40.265 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b9b52299-1635-4538-9958-a68b847ca2f3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:07:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:40.266 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217 namespace which is not needed anymore
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.284 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:07:40 np0005554845 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec 11 01:07:40 np0005554845 systemd[1]: machine-qemu\x2d4\x2dinstance\x2d0000000a.scope: Consumed 14.189s CPU time.
Dec 11 01:07:40 np0005554845 systemd-machined[153381]: Machine qemu-4-instance-0000000a terminated.
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.337 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.072s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.338 187132 DEBUG nova.virt.disk.api [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Checking if we can resize image /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.338 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.392 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.393 187132 DEBUG nova.virt.disk.api [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Cannot resize image /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.394 187132 DEBUG nova.objects.instance [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Lazy-loading 'migration_context' on Instance uuid e259711f-fca8-4dd1-9fd0-b49e0404776f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.412 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk.config 485376 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.438 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f raw /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk.config 485376" returned: 0 in 0.026s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.440 187132 DEBUG nova.virt.libvirt.volume.remotefs [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Copying file compute-0.ctlplane.example.com:/var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk.config to /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.441 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Running cmd (subprocess): scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk.config /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.463 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.473 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.475 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.477 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.546 187132 DEBUG nova.compute.manager [req-3a228803-3546-426a-8033-d61901a51e93 req-4eecefc4-428e-4363-a673-05506f032195 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-vif-unplugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.546 187132 DEBUG oslo_concurrency.lockutils [req-3a228803-3546-426a-8033-d61901a51e93 req-4eecefc4-428e-4363-a673-05506f032195 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.547 187132 DEBUG oslo_concurrency.lockutils [req-3a228803-3546-426a-8033-d61901a51e93 req-4eecefc4-428e-4363-a673-05506f032195 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.547 187132 DEBUG oslo_concurrency.lockutils [req-3a228803-3546-426a-8033-d61901a51e93 req-4eecefc4-428e-4363-a673-05506f032195 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.547 187132 DEBUG nova.compute.manager [req-3a228803-3546-426a-8033-d61901a51e93 req-4eecefc4-428e-4363-a673-05506f032195 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] No waiting events found dispatching network-vif-unplugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.547 187132 WARNING nova.compute.manager [req-3a228803-3546-426a-8033-d61901a51e93 req-4eecefc4-428e-4363-a673-05506f032195 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received unexpected event network-vif-unplugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 for instance with vm_state active and task_state resize_migrating.#033[00m
Dec 11 01:07:40 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[215306]: [NOTICE]   (215310) : haproxy version is 2.8.14-c23fe91
Dec 11 01:07:40 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[215306]: [NOTICE]   (215310) : path to executable is /usr/sbin/haproxy
Dec 11 01:07:40 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[215306]: [WARNING]  (215310) : Exiting Master process...
Dec 11 01:07:40 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[215306]: [ALERT]    (215310) : Current worker (215312) exited with code 143 (Terminated)
Dec 11 01:07:40 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[215306]: [WARNING]  (215310) : All workers exited. Exiting... (0)
Dec 11 01:07:40 np0005554845 systemd[1]: libpod-19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4.scope: Deactivated successfully.
Dec 11 01:07:40 np0005554845 podman[215578]: 2025-12-11 06:07:40.826897589 +0000 UTC m=+0.479188256 container died 19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 01:07:40 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4-userdata-shm.mount: Deactivated successfully.
Dec 11 01:07:40 np0005554845 systemd[1]: var-lib-containers-storage-overlay-d0a0c4dccc1e1d13e6fe6ee229bfb549d2a5b8cedb6537a24f030d6c3608ac6a-merged.mount: Deactivated successfully.
Dec 11 01:07:40 np0005554845 podman[215578]: 2025-12-11 06:07:40.869805372 +0000 UTC m=+0.522096019 container cleanup 19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:07:40 np0005554845 systemd[1]: libpod-conmon-19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4.scope: Deactivated successfully.
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.927 187132 DEBUG oslo_concurrency.processutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] CMD "scp -C -r compute-0.ctlplane.example.com:/var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f/disk.config /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f" returned: 0 in 0.486s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.928 187132 DEBUG nova.virt.libvirt.driver [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Plugging VIFs using destination host port bindings before live migration. _pre_live_migration_plug_vifs /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10794#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.930 187132 DEBUG nova.virt.libvirt.vif [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-2099766739',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-0.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-2099766739',id=12,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:07:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-0.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='7936cace634747e4997212d1e4422555',ramdisk_id='',reservation_id='r-z9l0w2g1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='
1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1239256349',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1239256349-project-member'},tags=<?>,task_state='migrating',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:07:16Z,user_data=None,user_id='dc2400e30fa0477abb781abef37fc5a4',uuid=e259711f-fca8-4dd1-9fd0-b49e0404776f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c70283ea-f020-4b95-96ff-d6995a36ba20", "address": "fa:16:3e:8d:41:0e", "network": {"id": "88c1a45b-56e2-4aa6-a974-6011ef55c52b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-210095557-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7936cace634747e4997212d1e4422555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc70283ea-f0", "ovs_interfaceid": "c70283ea-f020-4b95-96ff-d6995a36ba20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.930 187132 DEBUG nova.network.os_vif_util [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Converting VIF {"id": "c70283ea-f020-4b95-96ff-d6995a36ba20", "address": "fa:16:3e:8d:41:0e", "network": {"id": "88c1a45b-56e2-4aa6-a974-6011ef55c52b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-210095557-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7936cace634747e4997212d1e4422555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system"}, "devname": "tapc70283ea-f0", "ovs_interfaceid": "c70283ea-f020-4b95-96ff-d6995a36ba20", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.932 187132 DEBUG nova.network.os_vif_util [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:8d:41:0e,bridge_name='br-int',has_traffic_filtering=True,id=c70283ea-f020-4b95-96ff-d6995a36ba20,network=Network(88c1a45b-56e2-4aa6-a974-6011ef55c52b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc70283ea-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.932 187132 DEBUG os_vif [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:41:0e,bridge_name='br-int',has_traffic_filtering=True,id=c70283ea-f020-4b95-96ff-d6995a36ba20,network=Network(88c1a45b-56e2-4aa6-a974-6011ef55c52b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc70283ea-f0') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.933 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.934 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.934 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.938 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.938 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc70283ea-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.939 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapc70283ea-f0, col_values=(('external_ids', {'iface-id': 'c70283ea-f020-4b95-96ff-d6995a36ba20', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:8d:41:0e', 'vm-uuid': 'e259711f-fca8-4dd1-9fd0-b49e0404776f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.940 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:40 np0005554845 NetworkManager[55529]: <info>  [1765433260.9419] manager: (tapc70283ea-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/43)
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.945 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.951 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.953 187132 INFO os_vif [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:8d:41:0e,bridge_name='br-int',has_traffic_filtering=True,id=c70283ea-f020-4b95-96ff-d6995a36ba20,network=Network(88c1a45b-56e2-4aa6-a974-6011ef55c52b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc70283ea-f0')#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.953 187132 DEBUG nova.virt.libvirt.driver [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] No dst_numa_info in migrate_data, no cores to power up in pre_live_migration. pre_live_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:10954#033[00m
Dec 11 01:07:40 np0005554845 podman[215630]: 2025-12-11 06:07:40.954165417 +0000 UTC m=+0.063390408 container remove 19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.954 187132 DEBUG nova.compute.manager [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] driver pre_live_migration data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4zbj72ew',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e259711f-fca8-4dd1-9fd0-b49e0404776f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=<?>) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8668#033[00m
Dec 11 01:07:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:40.959 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3c1a9997-409a-4b08-981d-28772048ca4c]: (4, ('Thu Dec 11 06:07:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217 (19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4)\n19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4\nThu Dec 11 06:07:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217 (19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4)\n19e05919dcfec44804f6d2500a4f56d5ecd9e589a058bca70ff7229c91a14ae4\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:40.962 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[25bb0ddf-5bc0-4cf4-93b1-0be8f6055c70]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:40.964 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa8f22dd-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:40 np0005554845 kernel: tapfa8f22dd-20: left promiscuous mode
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.968 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:40 np0005554845 nova_compute[187128]: 2025-12-11 06:07:40.989 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:40.993 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[be96d7a9-0581-42ad-8e32-1425e873da20]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:41 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:41.023 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b68a7e0c-148f-4a37-8946-6620ef4f6263]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:41 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:41.025 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[10a76d75-2247-47fa-9b08-43c5ae84fb9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:41 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:41.045 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3197381f-a26b-4894-8d96-853e1a21a267]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 349117, 'reachable_time': 38951, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215655, 'error': None, 'target': 'ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:41 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:41.049 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:07:41 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:41.049 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[0b7b4dc2-2c1a-4e0a-adb1-2fa76a4e3ed1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:41 np0005554845 systemd[1]: run-netns-ovnmeta\x2dfa8f22dd\x2d28ac\x2d458d\x2d9f63\x2da7d8a915d217.mount: Deactivated successfully.
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.095 187132 INFO nova.virt.libvirt.driver [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Instance shutdown successfully after 3 seconds.#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.106 187132 INFO nova.virt.libvirt.driver [-] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Instance destroyed successfully.#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.107 187132 DEBUG nova.virt.libvirt.vif [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1186378686',display_name='tempest-TestNetworkAdvancedServerOps-server-1186378686',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1186378686',id=10,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDvu269K3Wq5vyC2HL1b6A8dJTLjcpEsj7D9cVxVm9DHphj86xufSg/vW/M3Pt7tVDz6L3awCoDApZq7RNDhAAwSmH7Z/SFby/7dDKGWNDp4HLOCIl9fXp9onGKTfEF+xg==',key_name='tempest-TestNetworkAdvancedServerOps-1141582953',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:06:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-2cydsqis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=ServiceList,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='resize_migrating',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:07:32Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=d29187d8-59e6-4e5a-aef7-97fef6cf24c7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": 
"10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1045645596", "vif_mac": "fa:16:3e:39:5c:9d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.108 187132 DEBUG nova.network.os_vif_util [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Converting VIF {"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}], "label": "tempest-network-smoke--1045645596", "vif_mac": "fa:16:3e:39:5c:9d"}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.109 187132 DEBUG nova.network.os_vif_util [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.109 187132 DEBUG os_vif [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.112 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.113 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaee944ef-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.114 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.116 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.118 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.121 187132 INFO os_vif [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d')#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.125 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.190 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.191 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.261 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.263 187132 DEBUG nova.virt.libvirt.volume.remotefs [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Copying file /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7_resize/disk to 192.168.122.101:/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.263 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): scp -r /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7_resize/disk 192.168.122.101:/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.848 187132 DEBUG nova.network.neutron [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Successfully created port: fb8865d1-91e3-4d6a-9437-231beabc5816 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.871 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "scp -r /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7_resize/disk 192.168.122.101:/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk" returned: 0 in 0.608s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.872 187132 DEBUG nova.virt.libvirt.volume.remotefs [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Copying file /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7_resize/disk.config to 192.168.122.101:/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.config copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec 11 01:07:41 np0005554845 nova_compute[187128]: 2025-12-11 06:07:41.873 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7_resize/disk.config 192.168.122.101:/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.config execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:42 np0005554845 nova_compute[187128]: 2025-12-11 06:07:42.386 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "scp -C -r /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7_resize/disk.config 192.168.122.101:/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.config" returned: 0 in 0.513s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:42 np0005554845 nova_compute[187128]: 2025-12-11 06:07:42.387 187132 DEBUG nova.virt.libvirt.volume.remotefs [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Copying file /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7_resize/disk.info to 192.168.122.101:/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.info copy_file /usr/lib/python3.9/site-packages/nova/virt/libvirt/volume/remotefs.py:103#033[00m
Dec 11 01:07:42 np0005554845 nova_compute[187128]: 2025-12-11 06:07:42.388 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Running cmd (subprocess): scp -C -r /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7_resize/disk.info 192.168.122.101:/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.info execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:42 np0005554845 nova_compute[187128]: 2025-12-11 06:07:42.609 187132 DEBUG oslo_concurrency.processutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] CMD "scp -C -r /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7_resize/disk.info 192.168.122.101:/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.info" returned: 0 in 0.221s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:42 np0005554845 nova_compute[187128]: 2025-12-11 06:07:42.853 187132 DEBUG nova.network.neutron [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Successfully created port: 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:07:42 np0005554845 nova_compute[187128]: 2025-12-11 06:07:42.882 187132 DEBUG neutronclient.v2_0.client [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Error message: {"NeutronError": {"type": "PortBindingNotFound", "message": "Binding for port aee944ef-3d55-4d72-85fd-0bcba5cebad9 for host compute-1.ctlplane.example.com could not be found.", "detail": ""}} _handle_fault_response /usr/lib/python3.9/site-packages/neutronclient/v2_0/client.py:262#033[00m
Dec 11 01:07:43 np0005554845 nova_compute[187128]: 2025-12-11 06:07:42.999 187132 DEBUG oslo_concurrency.lockutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:43 np0005554845 nova_compute[187128]: 2025-12-11 06:07:43.000 187132 DEBUG oslo_concurrency.lockutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:43 np0005554845 nova_compute[187128]: 2025-12-11 06:07:43.000 187132 DEBUG oslo_concurrency.lockutils [None req-5c258a4d-e54b-45d6-9d08-de59f1fe28df e928945f58ed4b1ba56f95cc823968c9 e2153b15898e47968bb4aab7e5094539 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:43 np0005554845 podman[215669]: 2025-12-11 06:07:43.130151013 +0000 UTC m=+0.060483360 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:07:43 np0005554845 nova_compute[187128]: 2025-12-11 06:07:43.424 187132 DEBUG nova.compute.manager [req-e2c0c01d-adec-4450-9646-c59e16787c3d req-17da8122-3768-4316-8c07-45340c203f81 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:07:43 np0005554845 nova_compute[187128]: 2025-12-11 06:07:43.425 187132 DEBUG oslo_concurrency.lockutils [req-e2c0c01d-adec-4450-9646-c59e16787c3d req-17da8122-3768-4316-8c07-45340c203f81 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:43 np0005554845 nova_compute[187128]: 2025-12-11 06:07:43.425 187132 DEBUG oslo_concurrency.lockutils [req-e2c0c01d-adec-4450-9646-c59e16787c3d req-17da8122-3768-4316-8c07-45340c203f81 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:43 np0005554845 nova_compute[187128]: 2025-12-11 06:07:43.425 187132 DEBUG oslo_concurrency.lockutils [req-e2c0c01d-adec-4450-9646-c59e16787c3d req-17da8122-3768-4316-8c07-45340c203f81 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:43 np0005554845 nova_compute[187128]: 2025-12-11 06:07:43.425 187132 DEBUG nova.compute.manager [req-e2c0c01d-adec-4450-9646-c59e16787c3d req-17da8122-3768-4316-8c07-45340c203f81 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] No waiting events found dispatching network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:07:43 np0005554845 nova_compute[187128]: 2025-12-11 06:07:43.425 187132 WARNING nova.compute.manager [req-e2c0c01d-adec-4450-9646-c59e16787c3d req-17da8122-3768-4316-8c07-45340c203f81 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received unexpected event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 for instance with vm_state active and task_state resize_migrated.#033[00m
Dec 11 01:07:43 np0005554845 nova_compute[187128]: 2025-12-11 06:07:43.721 187132 DEBUG nova.network.neutron [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Port c70283ea-f020-4b95-96ff-d6995a36ba20 updated with migration profile {'migrating_to': 'compute-2.ctlplane.example.com'} successfully _setup_migration_port_profile /usr/lib/python3.9/site-packages/nova/network/neutron.py:354#033[00m
Dec 11 01:07:43 np0005554845 nova_compute[187128]: 2025-12-11 06:07:43.722 187132 DEBUG nova.compute.manager [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] pre_live_migration result data is LibvirtLiveMigrateData(bdms=[],block_migration=True,disk_available_mb=73728,disk_over_commit=<?>,dst_numa_info=<?>,dst_supports_numa_live_migration=<?>,dst_wants_file_backed_memory=False,file_backed_memory_discard=<?>,filename='tmp4zbj72ew',graphics_listen_addr_spice=127.0.0.1,graphics_listen_addr_vnc=::,image_type='qcow2',instance_relative_path='e259711f-fca8-4dd1-9fd0-b49e0404776f',is_shared_block_storage=False,is_shared_instance_path=False,is_volume_backed=False,migration=<?>,old_vol_attachment_ids={},serial_listen_addr=None,serial_listen_ports=[],src_supports_native_luks=<?>,src_supports_numa_live_migration=<?>,supported_perf_events=[],target_connect_addr=None,vifs=[VIFMigrateData],wait_for_vif_plugged=True) pre_live_migration /usr/lib/python3.9/site-packages/nova/compute/manager.py:8723#033[00m
Dec 11 01:07:43 np0005554845 systemd[1]: Starting libvirt proxy daemon...
Dec 11 01:07:43 np0005554845 systemd[1]: Started libvirt proxy daemon.
Dec 11 01:07:44 np0005554845 kernel: tapc70283ea-f0: entered promiscuous mode
Dec 11 01:07:44 np0005554845 NetworkManager[55529]: <info>  [1765433264.0421] manager: (tapc70283ea-f0): new Tun device (/org/freedesktop/NetworkManager/Devices/44)
Dec 11 01:07:44 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:44Z|00067|binding|INFO|Claiming lport c70283ea-f020-4b95-96ff-d6995a36ba20 for this additional chassis.
Dec 11 01:07:44 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:44Z|00068|binding|INFO|c70283ea-f020-4b95-96ff-d6995a36ba20: Claiming fa:16:3e:8d:41:0e 10.100.0.10
Dec 11 01:07:44 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:44Z|00069|binding|INFO|Claiming lport 5877ffb5-7529-4bf6-bc7f-c3f20519f897 for this additional chassis.
Dec 11 01:07:44 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:44Z|00070|binding|INFO|5877ffb5-7529-4bf6-bc7f-c3f20519f897: Claiming fa:16:3e:ab:dd:f9 19.80.0.185
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.044 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:44 np0005554845 systemd-udevd[215725]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.092 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:44 np0005554845 systemd-machined[153381]: New machine qemu-5-instance-0000000c.
Dec 11 01:07:44 np0005554845 NetworkManager[55529]: <info>  [1765433264.1018] device (tapc70283ea-f0): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:07:44 np0005554845 NetworkManager[55529]: <info>  [1765433264.1029] device (tapc70283ea-f0): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.104 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:44 np0005554845 systemd[1]: Started Virtual Machine qemu-5-instance-0000000c.
Dec 11 01:07:44 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:44Z|00071|binding|INFO|Setting lport c70283ea-f020-4b95-96ff-d6995a36ba20 ovn-installed in OVS
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.428 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.663 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Acquiring lock "d2953461-e3c8-4475-978e-99fe1b807179" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.664 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.685 187132 DEBUG nova.compute.manager [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.798 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433264.7979362, e259711f-fca8-4dd1-9fd0-b49e0404776f => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.798 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] VM Started (Lifecycle Event)#033[00m
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.817 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.818 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.830 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.830 187132 INFO nova.compute.claims [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.834 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:07:44 np0005554845 nova_compute[187128]: 2025-12-11 06:07:44.869 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:45 np0005554845 nova_compute[187128]: 2025-12-11 06:07:45.003 187132 DEBUG nova.compute.provider_tree [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:07:45 np0005554845 podman[215742]: 2025-12-11 06:07:45.149338019 +0000 UTC m=+0.065165257 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:07:45 np0005554845 podman[215743]: 2025-12-11 06:07:45.215173773 +0000 UTC m=+0.131151655 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:07:45 np0005554845 nova_compute[187128]: 2025-12-11 06:07:45.329 187132 DEBUG nova.scheduler.client.report [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.114 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.166 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 1.348s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.167 187132 DEBUG nova.compute.manager [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.269 187132 DEBUG nova.compute.manager [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.270 187132 DEBUG nova.network.neutron [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.287 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433266.2871773, e259711f-fca8-4dd1-9fd0-b49e0404776f => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.288 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.305 187132 INFO nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.315 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.320 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: migrating, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.325 187132 DEBUG nova.compute.manager [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.358 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] During the sync_power process the instance has moved from host compute-0.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.421 187132 DEBUG nova.compute.manager [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.422 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.423 187132 INFO nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Creating image(s)#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.423 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Acquiring lock "/var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.424 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "/var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.424 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "/var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.437 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.511 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.512 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.513 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.523 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.586 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.588 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.870 187132 DEBUG nova.network.neutron [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Successfully updated port: fb8865d1-91e3-4d6a-9437-231beabc5816 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.906 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk 1073741824" returned: 0 in 0.318s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.907 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.394s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.907 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.983 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.984 187132 DEBUG nova.virt.disk.api [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Checking if we can resize image /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:07:46 np0005554845 nova_compute[187128]: 2025-12-11 06:07:46.985 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.039 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.041 187132 DEBUG nova.virt.disk.api [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Cannot resize image /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.042 187132 DEBUG nova.objects.instance [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lazy-loading 'migration_context' on Instance uuid d2953461-e3c8-4475-978e-99fe1b807179 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.085 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.086 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Ensure instance console log exists: /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.087 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.088 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.089 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.952 187132 DEBUG nova.network.neutron [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Successfully updated port: 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.965 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.965 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquired lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:07:47 np0005554845 nova_compute[187128]: 2025-12-11 06:07:47.966 187132 DEBUG nova.network.neutron [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:07:48 np0005554845 podman[215816]: 2025-12-11 06:07:48.134274466 +0000 UTC m=+0.061544919 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:07:48 np0005554845 nova_compute[187128]: 2025-12-11 06:07:48.184 187132 DEBUG nova.network.neutron [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:07:48 np0005554845 nova_compute[187128]: 2025-12-11 06:07:48.469 187132 DEBUG nova.compute.manager [req-91a10dc5-47c0-4652-8d79-be55381570c0 req-a8f5bb12-51f6-4d67-b148-eb1b84459ddf eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-changed-fb8865d1-91e3-4d6a-9437-231beabc5816 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:07:48 np0005554845 nova_compute[187128]: 2025-12-11 06:07:48.470 187132 DEBUG nova.compute.manager [req-91a10dc5-47c0-4652-8d79-be55381570c0 req-a8f5bb12-51f6-4d67-b148-eb1b84459ddf eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Refreshing instance network info cache due to event network-changed-fb8865d1-91e3-4d6a-9437-231beabc5816. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:07:48 np0005554845 nova_compute[187128]: 2025-12-11 06:07:48.471 187132 DEBUG oslo_concurrency.lockutils [req-91a10dc5-47c0-4652-8d79-be55381570c0 req-a8f5bb12-51f6-4d67-b148-eb1b84459ddf eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:07:48 np0005554845 nova_compute[187128]: 2025-12-11 06:07:48.664 187132 DEBUG nova.network.neutron [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Successfully created port: 0a706dcf-eb29-4098-946a-e1a25e5587a8 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:07:48 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:48Z|00072|binding|INFO|Claiming lport c70283ea-f020-4b95-96ff-d6995a36ba20 for this chassis.
Dec 11 01:07:48 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:48Z|00073|binding|INFO|c70283ea-f020-4b95-96ff-d6995a36ba20: Claiming fa:16:3e:8d:41:0e 10.100.0.10
Dec 11 01:07:48 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:48Z|00074|binding|INFO|Claiming lport 5877ffb5-7529-4bf6-bc7f-c3f20519f897 for this chassis.
Dec 11 01:07:48 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:48Z|00075|binding|INFO|5877ffb5-7529-4bf6-bc7f-c3f20519f897: Claiming fa:16:3e:ab:dd:f9 19.80.0.185
Dec 11 01:07:48 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:48Z|00076|binding|INFO|Setting lport c70283ea-f020-4b95-96ff-d6995a36ba20 up in Southbound
Dec 11 01:07:48 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:48Z|00077|binding|INFO|Setting lport 5877ffb5-7529-4bf6-bc7f-c3f20519f897 up in Southbound
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.848 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:dd:f9 19.80.0.185'], port_security=['fa:16:3e:ab:dd:f9 19.80.0.185'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': ''}, parent_port=['c70283ea-f020-4b95-96ff-d6995a36ba20'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1138526966', 'neutron:cidrs': '19.80.0.185/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1138526966', 'neutron:project_id': '7936cace634747e4997212d1e4422555', 'neutron:revision_number': '3', 'neutron:security_group_ids': 'b088e580-6b47-485c-9cd1-4a797e9267e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=52455e06-6286-47a1-bae7-41dc34cce60e, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5877ffb5-7529-4bf6-bc7f-c3f20519f897) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.850 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:41:0e 10.100.0.10'], port_security=['fa:16:3e:8d:41:0e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[True], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-812359412', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e259711f-fca8-4dd1-9fd0-b49e0404776f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88c1a45b-56e2-4aa6-a974-6011ef55c52b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-812359412', 'neutron:project_id': '7936cace634747e4997212d1e4422555', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'b088e580-6b47-485c-9cd1-4a797e9267e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b97d8aa-5e3a-46a8-a957-f048dac2ebf8, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=c70283ea-f020-4b95-96ff-d6995a36ba20) old=Port_Binding(up=[False], additional_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.851 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 5877ffb5-7529-4bf6-bc7f-c3f20519f897 in datapath dcd4b1db-0d1a-44e0-b910-2ed7106fc09e bound to our chassis#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.852 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network dcd4b1db-0d1a-44e0-b910-2ed7106fc09e#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.870 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[58f37ea1-7df7-4b27-88e0-bf5dd0254645]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.871 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapdcd4b1db-01 in ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.874 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapdcd4b1db-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.874 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[62fbc756-c838-4736-9ba1-5e449feaac88]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.876 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[46348230-65d9-40bb-9ac6-e27f70ee784e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.888 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[91ac7397-86e3-416c-8ae1-919ddb40f6de]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.920 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[16d2ea15-e9af-4ecf-87c2-2c9ab66aebb1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.955 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[7f0ebc91-f499-4b25-bc72-cfdf66b6d140]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:48 np0005554845 NetworkManager[55529]: <info>  [1765433268.9608] manager: (tapdcd4b1db-00): new Veth device (/org/freedesktop/NetworkManager/Devices/45)
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.960 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cc37ecb8-aa56-4139-be13-88755265be4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.987 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[d86817d2-4d78-4c9e-b24d-8cde496e6652]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:48 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:48.990 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[20457d80-dd7b-437c-92c7-0742ba0baadb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 systemd-udevd[215846]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:07:49 np0005554845 NetworkManager[55529]: <info>  [1765433269.0135] device (tapdcd4b1db-00): carrier: link connected
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.017 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a2759d-3992-4028-8d22-55a3fb121522]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.037 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d7bf7183-8a6d-46f0-842e-245b1475554e]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcd4b1db-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b3:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354365, 'reachable_time': 37822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215848, 'error': None, 'target': 'ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.054 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[40e0cd46-2304-4301-843d-adcf5ce12508]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fedd:b350'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 354365, 'tstamp': 354365}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215865, 'error': None, 'target': 'ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.074 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8bec504f-cf93-4f45-b3ad-73fde031772a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapdcd4b1db-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:dd:b3:50'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 24], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354365, 'reachable_time': 37822, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215866, 'error': None, 'target': 'ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.105 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[462e4a0d-d036-4d50-af47-7484ae4837c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.174 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cdf59b1e-41de-46c3-97c9-0fbc99d3c4a5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.175 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcd4b1db-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.176 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.176 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapdcd4b1db-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:49 np0005554845 NetworkManager[55529]: <info>  [1765433269.1783] manager: (tapdcd4b1db-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/46)
Dec 11 01:07:49 np0005554845 nova_compute[187128]: 2025-12-11 06:07:49.177 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:49 np0005554845 kernel: tapdcd4b1db-00: entered promiscuous mode
Dec 11 01:07:49 np0005554845 nova_compute[187128]: 2025-12-11 06:07:49.180 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.182 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapdcd4b1db-00, col_values=(('external_ids', {'iface-id': 'c46efaa2-51f3-49f2-88d1-957ae0b2127e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:49 np0005554845 nova_compute[187128]: 2025-12-11 06:07:49.183 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:49 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:49Z|00078|binding|INFO|Releasing lport c46efaa2-51f3-49f2-88d1-957ae0b2127e from this chassis (sb_readonly=0)
Dec 11 01:07:49 np0005554845 nova_compute[187128]: 2025-12-11 06:07:49.185 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.186 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/dcd4b1db-0d1a-44e0-b910-2ed7106fc09e.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/dcd4b1db-0d1a-44e0-b910-2ed7106fc09e.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.187 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[790d2af3-352e-49c0-8ab9-f5a8d9f44c23]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.189 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/dcd4b1db-0d1a-44e0-b910-2ed7106fc09e.pid.haproxy
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID dcd4b1db-0d1a-44e0-b910-2ed7106fc09e
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.191 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e', 'env', 'PROCESS_TAG=haproxy-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/dcd4b1db-0d1a-44e0-b910-2ed7106fc09e.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:07:49 np0005554845 nova_compute[187128]: 2025-12-11 06:07:49.201 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:49 np0005554845 podman[215899]: 2025-12-11 06:07:49.588690148 +0000 UTC m=+0.067160261 container create d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e, org.label-schema.vendor=CentOS, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 01:07:49 np0005554845 systemd[1]: Started libpod-conmon-d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212.scope.
Dec 11 01:07:49 np0005554845 podman[215899]: 2025-12-11 06:07:49.553308529 +0000 UTC m=+0.031778662 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:07:49 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:07:49 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e12cb972cb79968f308fe5e78b1db125dc616fbac33687c55cf48a1ec3c94ca8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:07:49 np0005554845 podman[215899]: 2025-12-11 06:07:49.677039362 +0000 UTC m=+0.155509495 container init d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:07:49 np0005554845 podman[215899]: 2025-12-11 06:07:49.683624971 +0000 UTC m=+0.162095084 container start d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0)
Dec 11 01:07:49 np0005554845 neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e[215915]: [NOTICE]   (215919) : New worker (215921) forked
Dec 11 01:07:49 np0005554845 neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e[215915]: [NOTICE]   (215919) : Loading success.
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.763 104320 INFO neutron.agent.ovn.metadata.agent [-] Port c70283ea-f020-4b95-96ff-d6995a36ba20 in datapath 88c1a45b-56e2-4aa6-a974-6011ef55c52b unbound from our chassis#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.769 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 88c1a45b-56e2-4aa6-a974-6011ef55c52b#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.781 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[316731e8-0040-49a3-b2ac-c6f7863f8f16]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.782 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap88c1a45b-51 in ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.784 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap88c1a45b-50 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.784 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2cfdea9e-8122-42dd-8e71-bd35d4edee23]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.785 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[598069c9-dc4d-4667-82d8-d57fa14a4d56]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.805 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[f2433099-dee5-496f-820d-02e281abfccd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.834 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b84d5760-33bd-4144-9eb0-e0a00f9b93d9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.857 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[cd286412-025a-4e0f-aff7-6f52509d0204]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 NetworkManager[55529]: <info>  [1765433269.8699] manager: (tap88c1a45b-50): new Veth device (/org/freedesktop/NetworkManager/Devices/47)
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.869 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a8625343-19fb-4b10-90eb-694da2404007]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 nova_compute[187128]: 2025-12-11 06:07:49.871 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:49 np0005554845 systemd-udevd[215852]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.904 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[b0bb9cad-1121-40f9-92e5-7c560bf0eba7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.908 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[405b361d-5223-4ce9-81ba-fc513131c67a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 NetworkManager[55529]: <info>  [1765433269.9331] device (tap88c1a45b-50): carrier: link connected
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.939 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[558ece97-b5a8-4c02-b783-90d5cbab54ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.967 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[21213f19-4a13-4028-a819-685cc5f660ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88c1a45b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:d2:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354457, 'reachable_time': 35365, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 215942, 'error': None, 'target': 'ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:49.989 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4b43a069-065e-4662-9fb6-dd7f8c1379d4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:d236'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 354457, 'tstamp': 354457}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 215943, 'error': None, 'target': 'ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:50.010 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1318f227-2282-4b4f-9d09-4077403db236]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap88c1a45b-51'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:d2:36'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 25], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354457, 'reachable_time': 35365, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 215944, 'error': None, 'target': 'ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.048 187132 INFO nova.compute.manager [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Post operation of migration started#033[00m
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:50.048 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2e5f27ee-3cef-463e-9631-30353fe3e319]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:50.118 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cbf54a11-4de9-412c-9f73-280d20bd3d3d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:50.121 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88c1a45b-50, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:50.121 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:50.122 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap88c1a45b-50, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.123 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:50 np0005554845 NetworkManager[55529]: <info>  [1765433270.1248] manager: (tap88c1a45b-50): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/48)
Dec 11 01:07:50 np0005554845 kernel: tap88c1a45b-50: entered promiscuous mode
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.126 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:50.127 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap88c1a45b-50, col_values=(('external_ids', {'iface-id': 'b370d7e1-99df-455c-a61a-d60c4eb58c3f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.128 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:50 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:50Z|00079|binding|INFO|Releasing lport b370d7e1-99df-455c-a61a-d60c4eb58c3f from this chassis (sb_readonly=0)
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.140 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:50.141 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/88c1a45b-56e2-4aa6-a974-6011ef55c52b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/88c1a45b-56e2-4aa6-a974-6011ef55c52b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:50.142 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5d560301-f398-49c3-b9e9-4ec64fb5ea9f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:50.143 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-88c1a45b-56e2-4aa6-a974-6011ef55c52b
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/88c1a45b-56e2-4aa6-a974-6011ef55c52b.pid.haproxy
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 88c1a45b-56e2-4aa6-a974-6011ef55c52b
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:07:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:50.143 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b', 'env', 'PROCESS_TAG=haproxy-88c1a45b-56e2-4aa6-a974-6011ef55c52b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/88c1a45b-56e2-4aa6-a974-6011ef55c52b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:07:50 np0005554845 podman[215976]: 2025-12-11 06:07:50.526221554 +0000 UTC m=+0.061879369 container create c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.561 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Acquiring lock "refresh_cache-e259711f-fca8-4dd1-9fd0-b49e0404776f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.561 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Acquired lock "refresh_cache-e259711f-fca8-4dd1-9fd0-b49e0404776f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.561 187132 DEBUG nova.network.neutron [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:07:50 np0005554845 systemd[1]: Started libpod-conmon-c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108.scope.
Dec 11 01:07:50 np0005554845 podman[215976]: 2025-12-11 06:07:50.494224826 +0000 UTC m=+0.029882721 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:07:50 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:07:50 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/59245bde2828d4faa2de49a2fbf8357d20eef2ded790bc6f110e77d97795e140/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:07:50 np0005554845 podman[215976]: 2025-12-11 06:07:50.610477007 +0000 UTC m=+0.146134822 container init c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 01:07:50 np0005554845 podman[215976]: 2025-12-11 06:07:50.615279406 +0000 UTC m=+0.150937221 container start c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 11 01:07:50 np0005554845 neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b[215992]: [NOTICE]   (215996) : New worker (215998) forked
Dec 11 01:07:50 np0005554845 neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b[215992]: [NOTICE]   (215996) : Loading success.
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.911 187132 DEBUG nova.network.neutron [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Updating instance_info_cache with network_info: [{"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.939 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Releasing lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.940 187132 DEBUG nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Instance network_info: |[{"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.940 187132 DEBUG oslo_concurrency.lockutils [req-91a10dc5-47c0-4652-8d79-be55381570c0 req-a8f5bb12-51f6-4d67-b148-eb1b84459ddf eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.940 187132 DEBUG nova.network.neutron [req-91a10dc5-47c0-4652-8d79-be55381570c0 req-a8f5bb12-51f6-4d67-b148-eb1b84459ddf eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Refreshing network info cache for port fb8865d1-91e3-4d6a-9437-231beabc5816 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.944 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Start _get_guest_xml network_info=[{"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 
1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.948 187132 WARNING nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.953 187132 DEBUG nova.virt.libvirt.host [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.953 187132 DEBUG nova.virt.libvirt.host [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.961 187132 DEBUG nova.virt.libvirt.host [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.961 187132 DEBUG nova.virt.libvirt.host [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.963 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.963 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.963 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.963 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.964 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.964 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.964 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.964 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.965 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.965 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.965 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.965 187132 DEBUG nova.virt.hardware [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.968 187132 DEBUG nova.virt.libvirt.vif [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1336912788',display_name='tempest-TestGettingAddress-server-1336912788',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1336912788',id=13,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIEZclnOFMrexdjXuOkORcLDtA8yO6kfE4DBOUgbM3gHbfN391UlOmGhfZVKD/zGl6Gj1jPXo/jXjCKVlMACkhXE/JYda9bh6TiWiKcbIr9HyCjOcURaG9csLYvNUZORNg==',key_name='tempest-TestGettingAddress-876887647',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-e20pu0cb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:07:39Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=524e0fc6-c557-4d6d-a3bf-a9af1980bf6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.968 187132 DEBUG nova.network.os_vif_util [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.969 187132 DEBUG nova.network.os_vif_util [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:01:48,bridge_name='br-int',has_traffic_filtering=True,id=fb8865d1-91e3-4d6a-9437-231beabc5816,network=Network(92ebde34-cbee-4b5e-ac06-7fdddcde07a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8865d1-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.970 187132 DEBUG nova.virt.libvirt.vif [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1336912788',display_name='tempest-TestGettingAddress-server-1336912788',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1336912788',id=13,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIEZclnOFMrexdjXuOkORcLDtA8yO6kfE4DBOUgbM3gHbfN391UlOmGhfZVKD/zGl6Gj1jPXo/jXjCKVlMACkhXE/JYda9bh6TiWiKcbIr9HyCjOcURaG9csLYvNUZORNg==',key_name='tempest-TestGettingAddress-876887647',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-e20pu0cb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:07:39Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=524e0fc6-c557-4d6d-a3bf-a9af1980bf6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.970 187132 DEBUG nova.network.os_vif_util [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.971 187132 DEBUG nova.network.os_vif_util [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:6a:cf,bridge_name='br-int',has_traffic_filtering=True,id=49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0,network=Network(a2bcf811-4eea-465b-bdbf-ec77bd6ec91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ac0b2b-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.971 187132 DEBUG nova.objects.instance [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.992 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  <uuid>524e0fc6-c557-4d6d-a3bf-a9af1980bf6d</uuid>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  <name>instance-0000000d</name>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestGettingAddress-server-1336912788</nova:name>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:07:50</nova:creationTime>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:        <nova:user uuid="60e9372de4754580913a836e11b9c248">tempest-TestGettingAddress-725523770-project-member</nova:user>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:        <nova:project uuid="79a211a6fc3c4f68b6c3d0ba433964d3">tempest-TestGettingAddress-725523770</nova:project>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:        <nova:port uuid="fb8865d1-91e3-4d6a-9437-231beabc5816">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:        <nova:port uuid="49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fefe:6acf" ipVersion="6"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <entry name="serial">524e0fc6-c557-4d6d-a3bf-a9af1980bf6d</entry>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <entry name="uuid">524e0fc6-c557-4d6d-a3bf-a9af1980bf6d</entry>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk.config"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:49:01:48"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <target dev="tapfb8865d1-91"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:fe:6a:cf"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <target dev="tap49ac0b2b-42"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/console.log" append="off"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:07:50 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:07:50 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:07:50 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:07:50 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.993 187132 DEBUG nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Preparing to wait for external event network-vif-plugged-fb8865d1-91e3-4d6a-9437-231beabc5816 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.993 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.994 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.994 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.994 187132 DEBUG nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Preparing to wait for external event network-vif-plugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.994 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.994 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.995 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.995 187132 DEBUG nova.virt.libvirt.vif [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1336912788',display_name='tempest-TestGettingAddress-server-1336912788',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1336912788',id=13,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIEZclnOFMrexdjXuOkORcLDtA8yO6kfE4DBOUgbM3gHbfN391UlOmGhfZVKD/zGl6Gj1jPXo/jXjCKVlMACkhXE/JYda9bh6TiWiKcbIr9HyCjOcURaG9csLYvNUZORNg==',key_name='tempest-TestGettingAddress-876887647',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-e20pu0cb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:07:39Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=524e0fc6-c557-4d6d-a3bf-a9af1980bf6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.995 187132 DEBUG nova.network.os_vif_util [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.996 187132 DEBUG nova.network.os_vif_util [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:49:01:48,bridge_name='br-int',has_traffic_filtering=True,id=fb8865d1-91e3-4d6a-9437-231beabc5816,network=Network(92ebde34-cbee-4b5e-ac06-7fdddcde07a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8865d1-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.996 187132 DEBUG os_vif [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:01:48,bridge_name='br-int',has_traffic_filtering=True,id=fb8865d1-91e3-4d6a-9437-231beabc5816,network=Network(92ebde34-cbee-4b5e-ac06-7fdddcde07a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8865d1-91') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.997 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.997 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:50 np0005554845 nova_compute[187128]: 2025-12-11 06:07:50.998 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.000 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.001 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfb8865d1-91, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.001 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapfb8865d1-91, col_values=(('external_ids', {'iface-id': 'fb8865d1-91e3-4d6a-9437-231beabc5816', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:49:01:48', 'vm-uuid': '524e0fc6-c557-4d6d-a3bf-a9af1980bf6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.003 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:51 np0005554845 NetworkManager[55529]: <info>  [1765433271.0041] manager: (tapfb8865d1-91): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/49)
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.006 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.008 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.009 187132 INFO os_vif [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:49:01:48,bridge_name='br-int',has_traffic_filtering=True,id=fb8865d1-91e3-4d6a-9437-231beabc5816,network=Network(92ebde34-cbee-4b5e-ac06-7fdddcde07a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8865d1-91')#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.010 187132 DEBUG nova.virt.libvirt.vif [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1336912788',display_name='tempest-TestGettingAddress-server-1336912788',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1336912788',id=13,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIEZclnOFMrexdjXuOkORcLDtA8yO6kfE4DBOUgbM3gHbfN391UlOmGhfZVKD/zGl6Gj1jPXo/jXjCKVlMACkhXE/JYda9bh6TiWiKcbIr9HyCjOcURaG9csLYvNUZORNg==',key_name='tempest-TestGettingAddress-876887647',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-e20pu0cb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:07:39Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=524e0fc6-c557-4d6d-a3bf-a9af1980bf6d,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.010 187132 DEBUG nova.network.os_vif_util [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.011 187132 DEBUG nova.network.os_vif_util [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:6a:cf,bridge_name='br-int',has_traffic_filtering=True,id=49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0,network=Network(a2bcf811-4eea-465b-bdbf-ec77bd6ec91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ac0b2b-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.011 187132 DEBUG os_vif [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:6a:cf,bridge_name='br-int',has_traffic_filtering=True,id=49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0,network=Network(a2bcf811-4eea-465b-bdbf-ec77bd6ec91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ac0b2b-42') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.012 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.012 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.012 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.016 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.016 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap49ac0b2b-42, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.017 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap49ac0b2b-42, col_values=(('external_ids', {'iface-id': '49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:fe:6a:cf', 'vm-uuid': '524e0fc6-c557-4d6d-a3bf-a9af1980bf6d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.018 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:51 np0005554845 NetworkManager[55529]: <info>  [1765433271.0193] manager: (tap49ac0b2b-42): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/50)
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.020 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.026 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.027 187132 INFO os_vif [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:6a:cf,bridge_name='br-int',has_traffic_filtering=True,id=49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0,network=Network(a2bcf811-4eea-465b-bdbf-ec77bd6ec91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ac0b2b-42')#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.116 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.117 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.117 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No VIF found with MAC fa:16:3e:49:01:48, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.117 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No VIF found with MAC fa:16:3e:fe:6a:cf, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.118 187132 INFO nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Using config drive#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.423 187132 DEBUG nova.network.neutron [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Successfully updated port: 0a706dcf-eb29-4098-946a-e1a25e5587a8 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.446 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Acquiring lock "refresh_cache-d2953461-e3c8-4475-978e-99fe1b807179" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.446 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Acquired lock "refresh_cache-d2953461-e3c8-4475-978e-99fe1b807179" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.447 187132 DEBUG nova.network.neutron [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.686 187132 DEBUG nova.compute.manager [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-changed-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.686 187132 DEBUG nova.compute.manager [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Refreshing instance network info cache due to event network-changed-aee944ef-3d55-4d72-85fd-0bcba5cebad9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.687 187132 DEBUG oslo_concurrency.lockutils [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.687 187132 DEBUG oslo_concurrency.lockutils [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.687 187132 DEBUG nova.network.neutron [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Refreshing network info cache for port aee944ef-3d55-4d72-85fd-0bcba5cebad9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.697 187132 DEBUG nova.network.neutron [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.759 187132 INFO nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Creating config drive at /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk.config#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.764 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi4fmoj0c execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.891 187132 DEBUG oslo_concurrency.processutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpi4fmoj0c" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:51 np0005554845 kernel: tapfb8865d1-91: entered promiscuous mode
Dec 11 01:07:51 np0005554845 NetworkManager[55529]: <info>  [1765433271.9617] manager: (tapfb8865d1-91): new Tun device (/org/freedesktop/NetworkManager/Devices/51)
Dec 11 01:07:51 np0005554845 nova_compute[187128]: 2025-12-11 06:07:51.967 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:51 np0005554845 NetworkManager[55529]: <info>  [1765433271.9760] device (tapfb8865d1-91): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:07:51 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:51Z|00080|binding|INFO|Claiming lport fb8865d1-91e3-4d6a-9437-231beabc5816 for this chassis.
Dec 11 01:07:51 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:51Z|00081|binding|INFO|fb8865d1-91e3-4d6a-9437-231beabc5816: Claiming fa:16:3e:49:01:48 10.100.0.11
Dec 11 01:07:51 np0005554845 NetworkManager[55529]: <info>  [1765433271.9772] device (tapfb8865d1-91): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:07:51 np0005554845 NetworkManager[55529]: <info>  [1765433271.9844] manager: (tap49ac0b2b-42): new Tun device (/org/freedesktop/NetworkManager/Devices/52)
Dec 11 01:07:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:51.983 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:01:48 10.100.0.11'], port_security=['fa:16:3e:49:01:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '524e0fc6-c557-4d6d-a3bf-a9af1980bf6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92ebde34-cbee-4b5e-ac06-7fdddcde07a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fb93259f-17ff-4ea0-aadc-09a566a9fe40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=125d7ef9-caf5-4c07-aba6-741106b35f5b, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=fb8865d1-91e3-4d6a-9437-231beabc5816) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:07:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:51.984 104320 INFO neutron.agent.ovn.metadata.agent [-] Port fb8865d1-91e3-4d6a-9437-231beabc5816 in datapath 92ebde34-cbee-4b5e-ac06-7fdddcde07a5 bound to our chassis#033[00m
Dec 11 01:07:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:51.986 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 92ebde34-cbee-4b5e-ac06-7fdddcde07a5#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.000 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5a328c7a-b018-4d1b-b467-2af609d26c96]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.001 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap92ebde34-c1 in ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.003 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap92ebde34-c0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.004 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c65fc8c3-ddb8-4617-a27c-1eb39eca4f1a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.004 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[53b18dac-42a5-4645-a0e0-e3322cb95034]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.016 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[51725364-7d67-4429-b5e0-f6400f2870ac]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 systemd-udevd[216069]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:07:52 np0005554845 systemd-machined[153381]: New machine qemu-6-instance-0000000d.
Dec 11 01:07:52 np0005554845 podman[216021]: 2025-12-11 06:07:52.048476304 +0000 UTC m=+0.094971805 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.054 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:52 np0005554845 systemd[1]: Started Virtual Machine qemu-6-instance-0000000d.
Dec 11 01:07:52 np0005554845 kernel: tap49ac0b2b-42: entered promiscuous mode
Dec 11 01:07:52 np0005554845 NetworkManager[55529]: <info>  [1765433272.0562] device (tap49ac0b2b-42): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:07:52 np0005554845 NetworkManager[55529]: <info>  [1765433272.0571] device (tap49ac0b2b-42): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.057 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:52Z|00082|binding|INFO|Claiming lport 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 for this chassis.
Dec 11 01:07:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:52Z|00083|binding|INFO|49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0: Claiming fa:16:3e:fe:6a:cf 2001:db8::f816:3eff:fefe:6acf
Dec 11 01:07:52 np0005554845 podman[216022]: 2025-12-11 06:07:52.059083181 +0000 UTC m=+0.103554838 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, version=9.6, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_id=edpm, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41)
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.063 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f87e9b83-c4f7-46b7-a61a-590206f45f8c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:52Z|00084|binding|INFO|Setting lport fb8865d1-91e3-4d6a-9437-231beabc5816 ovn-installed in OVS
Dec 11 01:07:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:52Z|00085|binding|INFO|Setting lport fb8865d1-91e3-4d6a-9437-231beabc5816 up in Southbound
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.068 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.068 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:6a:cf 2001:db8::f816:3eff:fefe:6acf'], port_security=['fa:16:3e:fe:6a:cf 2001:db8::f816:3eff:fefe:6acf'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fefe:6acf/64', 'neutron:device_id': '524e0fc6-c557-4d6d-a3bf-a9af1980bf6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'fb93259f-17ff-4ea0-aadc-09a566a9fe40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a2bfa7fb-80ee-49db-a84d-7d408b52f281, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:07:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:52Z|00086|binding|INFO|Setting lport 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 ovn-installed in OVS
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.076 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:52Z|00087|binding|INFO|Setting lport 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 up in Southbound
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.091 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[01567951-9c6c-43be-9d8a-f0d79d273890]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 NetworkManager[55529]: <info>  [1765433272.0967] manager: (tap92ebde34-c0): new Veth device (/org/freedesktop/NetworkManager/Devices/53)
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.096 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d8c5ca99-eda2-4fa5-b79d-f6654db39501]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.127 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[ecc4d00f-6991-4b40-8511-a9a88130a3d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.131 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[9fd9841b-2ddf-40bf-a806-96b1cf59d2fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 NetworkManager[55529]: <info>  [1765433272.1525] device (tap92ebde34-c0): carrier: link connected
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.157 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[d94944a4-73b1-4b32-9610-304d7a0f53aa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.172 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[15d8c3cb-b3ba-44db-8b92-3a98ced70ad9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92ebde34-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:a1:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354679, 'reachable_time': 16323, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216111, 'error': None, 'target': 'ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.187 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[682b4391-7d8a-43e6-92da-e322ce0f6c98]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed8:a1b2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 354679, 'tstamp': 354679}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216112, 'error': None, 'target': 'ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.202 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3a73d689-04be-4a7e-bed4-2caf3e05e6b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap92ebde34-c1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d8:a1:b2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 28], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354679, 'reachable_time': 16323, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216113, 'error': None, 'target': 'ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.234 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[98e6769d-4c46-43f9-b075-2ea73c8ae078]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.311 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[57b9b0b9-7b0d-4e4c-a36b-a24a0da8f5e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.313 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92ebde34-c0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.313 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.313 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap92ebde34-c0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.315 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:52 np0005554845 NetworkManager[55529]: <info>  [1765433272.3164] manager: (tap92ebde34-c0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/54)
Dec 11 01:07:52 np0005554845 kernel: tap92ebde34-c0: entered promiscuous mode
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.318 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.326 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap92ebde34-c0, col_values=(('external_ids', {'iface-id': '758c2e66-9229-4c0e-a50a-c862b3cbb788'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.327 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:52Z|00088|binding|INFO|Releasing lport 758c2e66-9229-4c0e-a50a-c862b3cbb788 from this chassis (sb_readonly=0)
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.328 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.344 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.344 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/92ebde34-cbee-4b5e-ac06-7fdddcde07a5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/92ebde34-cbee-4b5e-ac06-7fdddcde07a5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.345 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f18928db-1aa4-4611-af01-34d3391f8cae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.346 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-92ebde34-cbee-4b5e-ac06-7fdddcde07a5
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/92ebde34-cbee-4b5e-ac06-7fdddcde07a5.pid.haproxy
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 92ebde34-cbee-4b5e-ac06-7fdddcde07a5
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.346 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5', 'env', 'PROCESS_TAG=haproxy-92ebde34-cbee-4b5e-ac06-7fdddcde07a5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/92ebde34-cbee-4b5e-ac06-7fdddcde07a5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.566 187132 DEBUG nova.network.neutron [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Updating instance_info_cache with network_info: [{"id": "c70283ea-f020-4b95-96ff-d6995a36ba20", "address": "fa:16:3e:8d:41:0e", "network": {"id": "88c1a45b-56e2-4aa6-a974-6011ef55c52b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-210095557-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7936cace634747e4997212d1e4422555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc70283ea-f0", "ovs_interfaceid": "c70283ea-f020-4b95-96ff-d6995a36ba20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.593 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Releasing lock "refresh_cache-e259711f-fca8-4dd1-9fd0-b49e0404776f" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.611 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.612 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.612 187132 DEBUG oslo_concurrency.lockutils [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.allocate_pci_devices_for_instance" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.619 187132 INFO nova.virt.libvirt.driver [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Sending announce-self command to QEMU monitor. Attempt 1 of 3#033[00m
Dec 11 01:07:52 np0005554845 virtqemud[186638]: Domain id=5 name='instance-0000000c' uuid=e259711f-fca8-4dd1-9fd0-b49e0404776f is tainted: custom-monitor
Dec 11 01:07:52 np0005554845 podman[216145]: 2025-12-11 06:07:52.761744792 +0000 UTC m=+0.058700272 container create b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.814 187132 DEBUG nova.network.neutron [req-91a10dc5-47c0-4652-8d79-be55381570c0 req-a8f5bb12-51f6-4d67-b148-eb1b84459ddf eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Updated VIF entry in instance network info cache for port fb8865d1-91e3-4d6a-9437-231beabc5816. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:07:52 np0005554845 systemd[1]: Started libpod-conmon-b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195.scope.
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.815 187132 DEBUG nova.network.neutron [req-91a10dc5-47c0-4652-8d79-be55381570c0 req-a8f5bb12-51f6-4d67-b148-eb1b84459ddf eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Updating instance_info_cache with network_info: [{"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:07:52 np0005554845 podman[216145]: 2025-12-11 06:07:52.726022104 +0000 UTC m=+0.022977574 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:07:52 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:07:52 np0005554845 nova_compute[187128]: 2025-12-11 06:07:52.848 187132 DEBUG oslo_concurrency.lockutils [req-91a10dc5-47c0-4652-8d79-be55381570c0 req-a8f5bb12-51f6-4d67-b148-eb1b84459ddf eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:07:52 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a1e985ae3b9633304fab789af4b5a8f284c6b811b9a1e76f24a0b0cfb839e39b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:07:52 np0005554845 podman[216145]: 2025-12-11 06:07:52.862451111 +0000 UTC m=+0.159406591 container init b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:07:52 np0005554845 podman[216145]: 2025-12-11 06:07:52.868023132 +0000 UTC m=+0.164978582 container start b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:07:52 np0005554845 neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5[216160]: [NOTICE]   (216164) : New worker (216166) forked
Dec 11 01:07:52 np0005554845 neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5[216160]: [NOTICE]   (216164) : Loading success.
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.924 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 in datapath a2bcf811-4eea-465b-bdbf-ec77bd6ec91f unbound from our chassis#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.928 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network a2bcf811-4eea-465b-bdbf-ec77bd6ec91f#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.943 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5ac69678-dc68-4829-ac14-71d3387149e9]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.944 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapa2bcf811-41 in ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.946 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapa2bcf811-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.946 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ec90140c-2240-4463-8809-e8fd61a31627]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.946 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[bd043468-0899-4b4b-a24c-2fb04d43a25b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.962 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[b2cfbb69-7a65-4580-92b2-f0af1a515b62]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.975 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4d81936b-3e60-4ed3-8774-d48d9f7a4ed6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:52.998 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[befbed9b-96ad-4d32-b151-8438b7737994]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:53 np0005554845 systemd-udevd[216096]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:07:53 np0005554845 NetworkManager[55529]: <info>  [1765433273.0041] manager: (tapa2bcf811-40): new Veth device (/org/freedesktop/NetworkManager/Devices/55)
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.003 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[824cc2d5-4943-4267-aed6-777752a38e1b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.034 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[f92357e1-ba6b-4ba3-a205-d134e967f24c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.037 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[90c379d7-7abe-4868-9447-9d25ad11147f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:53 np0005554845 NetworkManager[55529]: <info>  [1765433273.0561] device (tapa2bcf811-40): carrier: link connected
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.059 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[a91fa4bc-82fe-4e81-a1eb-7bd4e9eeae2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.077 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cc03fa85-c2c4-45c0-9a44-b3a8f833f0ec]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2bcf811-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:f8:04'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354770, 'reachable_time': 39022, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216185, 'error': None, 'target': 'ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.098 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[9e547597-d3d8-4f14-ad23-d9791b853133]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe09:f804'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 354770, 'tstamp': 354770}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216186, 'error': None, 'target': 'ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.117 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f5dca04a-fe08-43bd-be83-02c5ae581607]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapa2bcf811-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:09:f8:04'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 29], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354770, 'reachable_time': 39022, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216187, 'error': None, 'target': 'ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.147 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[bece0a61-c612-49c4-8ed2-e6df7f001aeb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.178 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b7164d67-4136-4494-9929-ae0cf74f5006]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.179 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2bcf811-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.180 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.180 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapa2bcf811-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:53 np0005554845 kernel: tapa2bcf811-40: entered promiscuous mode
Dec 11 01:07:53 np0005554845 NetworkManager[55529]: <info>  [1765433273.1826] manager: (tapa2bcf811-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/56)
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.182 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.189 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapa2bcf811-40, col_values=(('external_ids', {'iface-id': '19fbb851-56d3-4e9c-872f-295bbcc3715e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.190 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:53 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:53Z|00089|binding|INFO|Releasing lport 19fbb851-56d3-4e9c-872f-295bbcc3715e from this chassis (sb_readonly=0)
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.201 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/a2bcf811-4eea-465b-bdbf-ec77bd6ec91f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/a2bcf811-4eea-465b-bdbf-ec77bd6ec91f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.203 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.202 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0cb462e2-91b4-4551-92bd-dfa1890ccb26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.203 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/a2bcf811-4eea-465b-bdbf-ec77bd6ec91f.pid.haproxy
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID a2bcf811-4eea-465b-bdbf-ec77bd6ec91f
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:07:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:53.203 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f', 'env', 'PROCESS_TAG=haproxy-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/a2bcf811-4eea-465b-bdbf-ec77bd6ec91f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.347 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433273.3466618, 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.348 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] VM Started (Lifecycle Event)#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.364 187132 DEBUG nova.network.neutron [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Updating instance_info_cache with network_info: [{"id": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "address": "fa:16:3e:d8:5b:71", "network": {"id": "11d68746-7105-4c6c-a1c2-930f081e2867", "bridge": "br-int", "label": "tempest-TestServerMultinode-656670444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d95ba983e7b4ec7b161e0ab6b0b56ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a706dcf-eb", "ovs_interfaceid": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.376 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.380 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433273.3476243, 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.380 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.403 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Releasing lock "refresh_cache-d2953461-e3c8-4475-978e-99fe1b807179" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.403 187132 DEBUG nova.compute.manager [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Instance network_info: |[{"id": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "address": "fa:16:3e:d8:5b:71", "network": {"id": "11d68746-7105-4c6c-a1c2-930f081e2867", "bridge": "br-int", "label": "tempest-TestServerMultinode-656670444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d95ba983e7b4ec7b161e0ab6b0b56ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a706dcf-eb", "ovs_interfaceid": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.405 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Start _get_guest_xml network_info=[{"id": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "address": "fa:16:3e:d8:5b:71", "network": {"id": "11d68746-7105-4c6c-a1c2-930f081e2867", "bridge": "br-int", "label": "tempest-TestServerMultinode-656670444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d95ba983e7b4ec7b161e0ab6b0b56ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a706dcf-eb", "ovs_interfaceid": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.406 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.410 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.411 187132 WARNING nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.417 187132 DEBUG nova.virt.libvirt.host [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.418 187132 DEBUG nova.virt.libvirt.host [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.421 187132 DEBUG nova.virt.libvirt.host [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.422 187132 DEBUG nova.virt.libvirt.host [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.422 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.423 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.423 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.423 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.423 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.424 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.424 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.424 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.424 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.425 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.425 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.425 187132 DEBUG nova.virt.hardware [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.428 187132 DEBUG nova.virt.libvirt.vif [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:07:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-297968829',display_name='tempest-TestServerMultinode-server-297968829',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-297968829',id=14,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f033bed42fce423089c02594df44ced9',ramdisk_id='',reservation_id='r-t08rd9ik',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1326182770',owner_user_name='tempest-TestServerMultinode-1326
182770-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:07:46Z,user_data=None,user_id='bae38cdb18134ffe9b5f38f23622cd25',uuid=d2953461-e3c8-4475-978e-99fe1b807179,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "address": "fa:16:3e:d8:5b:71", "network": {"id": "11d68746-7105-4c6c-a1c2-930f081e2867", "bridge": "br-int", "label": "tempest-TestServerMultinode-656670444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d95ba983e7b4ec7b161e0ab6b0b56ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a706dcf-eb", "ovs_interfaceid": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.428 187132 DEBUG nova.network.os_vif_util [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Converting VIF {"id": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "address": "fa:16:3e:d8:5b:71", "network": {"id": "11d68746-7105-4c6c-a1c2-930f081e2867", "bridge": "br-int", "label": "tempest-TestServerMultinode-656670444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d95ba983e7b4ec7b161e0ab6b0b56ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a706dcf-eb", "ovs_interfaceid": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.429 187132 DEBUG nova.network.os_vif_util [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:5b:71,bridge_name='br-int',has_traffic_filtering=True,id=0a706dcf-eb29-4098-946a-e1a25e5587a8,network=Network(11d68746-7105-4c6c-a1c2-930f081e2867),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a706dcf-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.430 187132 DEBUG nova.objects.instance [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lazy-loading 'pci_devices' on Instance uuid d2953461-e3c8-4475-978e-99fe1b807179 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.434 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.442 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  <uuid>d2953461-e3c8-4475-978e-99fe1b807179</uuid>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  <name>instance-0000000e</name>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestServerMultinode-server-297968829</nova:name>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:07:53</nova:creationTime>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:        <nova:user uuid="bae38cdb18134ffe9b5f38f23622cd25">tempest-TestServerMultinode-1326182770-project-admin</nova:user>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:        <nova:project uuid="f033bed42fce423089c02594df44ced9">tempest-TestServerMultinode-1326182770</nova:project>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:        <nova:port uuid="0a706dcf-eb29-4098-946a-e1a25e5587a8">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <entry name="serial">d2953461-e3c8-4475-978e-99fe1b807179</entry>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <entry name="uuid">d2953461-e3c8-4475-978e-99fe1b807179</entry>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk.config"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:d8:5b:71"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <target dev="tap0a706dcf-eb"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/console.log" append="off"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:07:53 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:07:53 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:07:53 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:07:53 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.443 187132 DEBUG nova.compute.manager [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Preparing to wait for external event network-vif-plugged-0a706dcf-eb29-4098-946a-e1a25e5587a8 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.443 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Acquiring lock "d2953461-e3c8-4475-978e-99fe1b807179-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.444 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.444 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.444 187132 DEBUG nova.virt.libvirt.vif [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:07:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestServerMultinode-server-297968829',display_name='tempest-TestServerMultinode-server-297968829',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-297968829',id=14,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='f033bed42fce423089c02594df44ced9',ramdisk_id='',reservation_id='r-t08rd9ik',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestServerMultinode-1326182770',owner_user_name='tempest-TestServerMult
inode-1326182770-project-admin'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:07:46Z,user_data=None,user_id='bae38cdb18134ffe9b5f38f23622cd25',uuid=d2953461-e3c8-4475-978e-99fe1b807179,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "address": "fa:16:3e:d8:5b:71", "network": {"id": "11d68746-7105-4c6c-a1c2-930f081e2867", "bridge": "br-int", "label": "tempest-TestServerMultinode-656670444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d95ba983e7b4ec7b161e0ab6b0b56ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a706dcf-eb", "ovs_interfaceid": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.445 187132 DEBUG nova.network.os_vif_util [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Converting VIF {"id": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "address": "fa:16:3e:d8:5b:71", "network": {"id": "11d68746-7105-4c6c-a1c2-930f081e2867", "bridge": "br-int", "label": "tempest-TestServerMultinode-656670444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d95ba983e7b4ec7b161e0ab6b0b56ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a706dcf-eb", "ovs_interfaceid": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.445 187132 DEBUG nova.network.os_vif_util [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d8:5b:71,bridge_name='br-int',has_traffic_filtering=True,id=0a706dcf-eb29-4098-946a-e1a25e5587a8,network=Network(11d68746-7105-4c6c-a1c2-930f081e2867),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a706dcf-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.445 187132 DEBUG os_vif [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:5b:71,bridge_name='br-int',has_traffic_filtering=True,id=0a706dcf-eb29-4098-946a-e1a25e5587a8,network=Network(11d68746-7105-4c6c-a1c2-930f081e2867),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a706dcf-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.446 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.446 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.446 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.448 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.448 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0a706dcf-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.449 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0a706dcf-eb, col_values=(('external_ids', {'iface-id': '0a706dcf-eb29-4098-946a-e1a25e5587a8', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d8:5b:71', 'vm-uuid': 'd2953461-e3c8-4475-978e-99fe1b807179'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.450 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:53 np0005554845 NetworkManager[55529]: <info>  [1765433273.4510] manager: (tap0a706dcf-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/57)
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.452 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.455 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.456 187132 INFO os_vif [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d8:5b:71,bridge_name='br-int',has_traffic_filtering=True,id=0a706dcf-eb29-4098-946a-e1a25e5587a8,network=Network(11d68746-7105-4c6c-a1c2-930f081e2867),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a706dcf-eb')#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.507 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.507 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.507 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] No VIF found with MAC fa:16:3e:d8:5b:71, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.508 187132 INFO nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Using config drive#033[00m
Dec 11 01:07:53 np0005554845 podman[216225]: 2025-12-11 06:07:53.538333996 +0000 UTC m=+0.048416543 container create 41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 01:07:53 np0005554845 systemd[1]: Started libpod-conmon-41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0.scope.
Dec 11 01:07:53 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:07:53 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8633deebe954b1f6cab70e8110a2cd0e3300c5ea45ed287441f7eb5acbc10b9b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:07:53 np0005554845 podman[216225]: 2025-12-11 06:07:53.512217608 +0000 UTC m=+0.022300185 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:07:53 np0005554845 podman[216225]: 2025-12-11 06:07:53.6182083 +0000 UTC m=+0.128290897 container init 41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:07:53 np0005554845 podman[216225]: 2025-12-11 06:07:53.624411918 +0000 UTC m=+0.134494465 container start 41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Dec 11 01:07:53 np0005554845 nova_compute[187128]: 2025-12-11 06:07:53.630 187132 INFO nova.virt.libvirt.driver [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Sending announce-self command to QEMU monitor. Attempt 2 of 3#033[00m
Dec 11 01:07:53 np0005554845 neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f[216240]: [NOTICE]   (216244) : New worker (216246) forked
Dec 11 01:07:53 np0005554845 neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f[216240]: [NOTICE]   (216244) : Loading success.
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.002 187132 DEBUG nova.network.neutron [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updated VIF entry in instance network info cache for port aee944ef-3d55-4d72-85fd-0bcba5cebad9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.002 187132 DEBUG nova.network.neutron [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance_info_cache with network_info: [{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.029 187132 DEBUG oslo_concurrency.lockutils [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.030 187132 DEBUG nova.compute.manager [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-changed-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.030 187132 DEBUG nova.compute.manager [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Refreshing instance network info cache due to event network-changed-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.030 187132 DEBUG oslo_concurrency.lockutils [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.031 187132 DEBUG oslo_concurrency.lockutils [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.031 187132 DEBUG nova.network.neutron [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Refreshing network info cache for port 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.164 187132 INFO nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Creating config drive at /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk.config#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.169 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_bs6ijj execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.292 187132 DEBUG oslo_concurrency.processutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpv_bs6ijj" returned: 0 in 0.123s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:07:54 np0005554845 kernel: tap0a706dcf-eb: entered promiscuous mode
Dec 11 01:07:54 np0005554845 NetworkManager[55529]: <info>  [1765433274.3617] manager: (tap0a706dcf-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/58)
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.363 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:54 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:54Z|00090|binding|INFO|Claiming lport 0a706dcf-eb29-4098-946a-e1a25e5587a8 for this chassis.
Dec 11 01:07:54 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:54Z|00091|binding|INFO|0a706dcf-eb29-4098-946a-e1a25e5587a8: Claiming fa:16:3e:d8:5b:71 10.100.0.6
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.369 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:54 np0005554845 NetworkManager[55529]: <info>  [1765433274.3753] device (tap0a706dcf-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:07:54 np0005554845 NetworkManager[55529]: <info>  [1765433274.3762] device (tap0a706dcf-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:07:54 np0005554845 systemd-machined[153381]: New machine qemu-7-instance-0000000e.
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.420 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:54 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:54Z|00092|binding|INFO|Setting lport 0a706dcf-eb29-4098-946a-e1a25e5587a8 ovn-installed in OVS
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.424 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:54 np0005554845 systemd[1]: Started Virtual Machine qemu-7-instance-0000000e.
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.637 187132 INFO nova.virt.libvirt.driver [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Sending announce-self command to QEMU monitor. Attempt 3 of 3#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.643 187132 DEBUG nova.compute.manager [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.730 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433274.7301922, d2953461-e3c8-4475-978e-99fe1b807179 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.731 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d2953461-e3c8-4475-978e-99fe1b807179] VM Started (Lifecycle Event)#033[00m
Dec 11 01:07:54 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:54Z|00093|binding|INFO|Setting lport 0a706dcf-eb29-4098-946a-e1a25e5587a8 up in Southbound
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.760 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:5b:71 10.100.0.6'], port_security=['fa:16:3e:d8:5b:71 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd2953461-e3c8-4475-978e-99fe1b807179', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11d68746-7105-4c6c-a1c2-930f081e2867', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f033bed42fce423089c02594df44ced9', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'e44c51af-f4fc-4e7e-9665-16178aceb0df', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34a05a6c-1f90-4e8f-99db-f4d9b3f4fa41, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=0a706dcf-eb29-4098-946a-e1a25e5587a8) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.761 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 0a706dcf-eb29-4098-946a-e1a25e5587a8 in datapath 11d68746-7105-4c6c-a1c2-930f081e2867 bound to our chassis#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.764 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 11d68746-7105-4c6c-a1c2-930f081e2867#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.779 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[74b96a37-f0e4-4fe9-8f23-b80d5bd8392d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.780 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap11d68746-71 in ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.780 187132 DEBUG nova.objects.instance [None req-8db5d592-2ca7-403e-bc5e-063fcaf9f424 801136b467bd4ecabe1c9c48a9463a74 836da9a83db84b7bae37c12becc50ed3 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.782 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap11d68746-70 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.782 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a1938376-1841-4e36-ba65-4fd40cf9cee3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.783 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[146b390d-bda0-4610-854c-35575fae7e4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.787 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.792 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433274.7303283, d2953461-e3c8-4475-978e-99fe1b807179 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.793 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d2953461-e3c8-4475-978e-99fe1b807179] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.795 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[c4886cb9-d8a1-42f5-9fc3-8e81081a81e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.821 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[59018d6c-ffbe-47d0-a82a-ac568f1c0d48]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.831 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.836 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.856 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[0e5608bf-750d-468f-9073-0bb4d1a31f4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 NetworkManager[55529]: <info>  [1765433274.8618] manager: (tap11d68746-70): new Veth device (/org/freedesktop/NetworkManager/Devices/59)
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.862 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5f567246-8e81-45c1-8b9f-a505c9441fbf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.867 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d2953461-e3c8-4475-978e-99fe1b807179] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:07:54 np0005554845 nova_compute[187128]: 2025-12-11 06:07:54.881 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.893 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[b7273a7e-cd41-4ae6-adf1-00c279d55533]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.897 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[da8f4166-89fe-47be-90dd-0ec4a611c72f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 NetworkManager[55529]: <info>  [1765433274.9231] device (tap11d68746-70): carrier: link connected
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.927 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[4579955d-ac7c-4522-b231-a44c97931962]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.944 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b98da2dc-b490-48f6-90d6-42a28bb1eaa8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap11d68746-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:46:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354956, 'reachable_time': 42055, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216296, 'error': None, 'target': 'ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.963 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[01bb0141-1025-414f-be63-a687fc3efd3d]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe2f:46a2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 354956, 'tstamp': 354956}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216297, 'error': None, 'target': 'ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:54.985 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[15415d40-c9d7-46c7-a634-a0a523bbfa0b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap11d68746-71'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:2f:46:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 31], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354956, 'reachable_time': 42055, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216298, 'error': None, 'target': 'ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:55.015 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b61ab669-3ac4-44c8-a868-3e0865db1c76]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:55.073 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[931a4862-f19b-491c-9240-99af35ef9f34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:55.075 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11d68746-70, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:55.075 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:55.076 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap11d68746-70, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:55 np0005554845 nova_compute[187128]: 2025-12-11 06:07:55.077 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:55 np0005554845 NetworkManager[55529]: <info>  [1765433275.0789] manager: (tap11d68746-70): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/60)
Dec 11 01:07:55 np0005554845 kernel: tap11d68746-70: entered promiscuous mode
Dec 11 01:07:55 np0005554845 nova_compute[187128]: 2025-12-11 06:07:55.080 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:55.081 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap11d68746-70, col_values=(('external_ids', {'iface-id': '62d8ffff-8a6f-415b-a1d8-3125ebbaf874'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:55 np0005554845 nova_compute[187128]: 2025-12-11 06:07:55.083 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:55 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:55Z|00094|binding|INFO|Releasing lport 62d8ffff-8a6f-415b-a1d8-3125ebbaf874 from this chassis (sb_readonly=0)
Dec 11 01:07:55 np0005554845 nova_compute[187128]: 2025-12-11 06:07:55.098 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:55.099 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/11d68746-7105-4c6c-a1c2-930f081e2867.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/11d68746-7105-4c6c-a1c2-930f081e2867.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:55.100 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2524e6a1-db4f-4a85-ab7a-bc7cb60ca716]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:55.101 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-11d68746-7105-4c6c-a1c2-930f081e2867
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/11d68746-7105-4c6c-a1c2-930f081e2867.pid.haproxy
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 11d68746-7105-4c6c-a1c2-930f081e2867
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:07:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:55.101 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867', 'env', 'PROCESS_TAG=haproxy-11d68746-7105-4c6c-a1c2-930f081e2867', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/11d68746-7105-4c6c-a1c2-930f081e2867.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:07:55 np0005554845 nova_compute[187128]: 2025-12-11 06:07:55.505 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433260.5044808, d29187d8-59e6-4e5a-aef7-97fef6cf24c7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:07:55 np0005554845 nova_compute[187128]: 2025-12-11 06:07:55.505 187132 INFO nova.compute.manager [-] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:07:55 np0005554845 podman[216330]: 2025-12-11 06:07:55.525907226 +0000 UTC m=+0.071425048 container create 79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 01:07:55 np0005554845 nova_compute[187128]: 2025-12-11 06:07:55.533 187132 DEBUG nova.compute.manager [None req-e1a7ac1a-bce9-4465-8ddf-87e361656488 - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:07:55 np0005554845 nova_compute[187128]: 2025-12-11 06:07:55.539 187132 DEBUG nova.compute.manager [None req-e1a7ac1a-bce9-4465-8ddf-87e361656488 - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Synchronizing instance power state after lifecycle event "Stopped"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 4 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:07:55 np0005554845 nova_compute[187128]: 2025-12-11 06:07:55.564 187132 INFO nova.compute.manager [None req-e1a7ac1a-bce9-4465-8ddf-87e361656488 - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] During the sync_power process the instance has moved from host compute-1.ctlplane.example.com to host compute-2.ctlplane.example.com#033[00m
Dec 11 01:07:55 np0005554845 systemd[1]: Started libpod-conmon-79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719.scope.
Dec 11 01:07:55 np0005554845 podman[216330]: 2025-12-11 06:07:55.49248355 +0000 UTC m=+0.038001412 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:07:55 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:07:55 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3324b6b3557bdb01db873748671f00d5c40e8cb5808d3bb88a7b3d5d2a2a55d4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:07:55 np0005554845 podman[216330]: 2025-12-11 06:07:55.630671704 +0000 UTC m=+0.176189576 container init 79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 01:07:55 np0005554845 podman[216330]: 2025-12-11 06:07:55.641689353 +0000 UTC m=+0.187207185 container start 79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:07:55 np0005554845 neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867[216345]: [NOTICE]   (216349) : New worker (216351) forked
Dec 11 01:07:55 np0005554845 neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867[216345]: [NOTICE]   (216349) : Loading success.
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.300 187132 DEBUG nova.network.neutron [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Updated VIF entry in instance network info cache for port 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.301 187132 DEBUG nova.network.neutron [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Updating instance_info_cache with network_info: [{"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.331 187132 DEBUG oslo_concurrency.lockutils [req-e529f426-29f6-4489-baf9-ae4b7d8f2865 req-adddd320-f498-4997-b1b0-aeab4e26587a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.566 187132 DEBUG nova.compute.manager [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Received event network-changed-0a706dcf-eb29-4098-946a-e1a25e5587a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.566 187132 DEBUG nova.compute.manager [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Refreshing instance network info cache due to event network-changed-0a706dcf-eb29-4098-946a-e1a25e5587a8. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.567 187132 DEBUG oslo_concurrency.lockutils [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-d2953461-e3c8-4475-978e-99fe1b807179" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.567 187132 DEBUG oslo_concurrency.lockutils [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-d2953461-e3c8-4475-978e-99fe1b807179" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.567 187132 DEBUG nova.network.neutron [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Refreshing network info cache for port 0a706dcf-eb29-4098-946a-e1a25e5587a8 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.665 187132 DEBUG nova.compute.manager [req-5c5a891c-d13b-48b7-894c-00d651ecf110 req-8dd8bb35-5a4d-43dd-8973-c51c0b43aae4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-plugged-fb8865d1-91e3-4d6a-9437-231beabc5816 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.665 187132 DEBUG oslo_concurrency.lockutils [req-5c5a891c-d13b-48b7-894c-00d651ecf110 req-8dd8bb35-5a4d-43dd-8973-c51c0b43aae4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.666 187132 DEBUG oslo_concurrency.lockutils [req-5c5a891c-d13b-48b7-894c-00d651ecf110 req-8dd8bb35-5a4d-43dd-8973-c51c0b43aae4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.666 187132 DEBUG oslo_concurrency.lockutils [req-5c5a891c-d13b-48b7-894c-00d651ecf110 req-8dd8bb35-5a4d-43dd-8973-c51c0b43aae4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.666 187132 DEBUG nova.compute.manager [req-5c5a891c-d13b-48b7-894c-00d651ecf110 req-8dd8bb35-5a4d-43dd-8973-c51c0b43aae4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Processing event network-vif-plugged-fb8865d1-91e3-4d6a-9437-231beabc5816 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.666 187132 DEBUG nova.compute.manager [req-5c5a891c-d13b-48b7-894c-00d651ecf110 req-8dd8bb35-5a4d-43dd-8973-c51c0b43aae4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-plugged-fb8865d1-91e3-4d6a-9437-231beabc5816 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.667 187132 DEBUG oslo_concurrency.lockutils [req-5c5a891c-d13b-48b7-894c-00d651ecf110 req-8dd8bb35-5a4d-43dd-8973-c51c0b43aae4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.667 187132 DEBUG oslo_concurrency.lockutils [req-5c5a891c-d13b-48b7-894c-00d651ecf110 req-8dd8bb35-5a4d-43dd-8973-c51c0b43aae4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.668 187132 DEBUG oslo_concurrency.lockutils [req-5c5a891c-d13b-48b7-894c-00d651ecf110 req-8dd8bb35-5a4d-43dd-8973-c51c0b43aae4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.668 187132 DEBUG nova.compute.manager [req-5c5a891c-d13b-48b7-894c-00d651ecf110 req-8dd8bb35-5a4d-43dd-8973-c51c0b43aae4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] No event matching network-vif-plugged-fb8865d1-91e3-4d6a-9437-231beabc5816 in dict_keys([('network-vif-plugged', '49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325
Dec 11 01:07:56 np0005554845 nova_compute[187128]: 2025-12-11 06:07:56.668 187132 WARNING nova.compute.manager [req-5c5a891c-d13b-48b7-894c-00d651ecf110 req-8dd8bb35-5a4d-43dd-8973-c51c0b43aae4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received unexpected event network-vif-plugged-fb8865d1-91e3-4d6a-9437-231beabc5816 for instance with vm_state building and task_state spawning.
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.450 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.704 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.704 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.705 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.705 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.705 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] No waiting events found dispatching network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.706 187132 WARNING nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received unexpected event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 for instance with vm_state resized and task_state resize_reverting.
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.706 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.706 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.707 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.707 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.707 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] No waiting events found dispatching network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.708 187132 WARNING nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received unexpected event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 for instance with vm_state resized and task_state resize_reverting.
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.708 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-plugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.708 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.708 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.709 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.709 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Processing event network-vif-plugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.709 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-plugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.710 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.710 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.710 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.711 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] No waiting events found dispatching network-vif-plugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.711 187132 WARNING nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received unexpected event network-vif-plugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 for instance with vm_state building and task_state spawning.
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.711 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-vif-unplugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.712 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.712 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.712 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.713 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] No waiting events found dispatching network-vif-unplugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.713 187132 WARNING nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received unexpected event network-vif-unplugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 for instance with vm_state resized and task_state resize_reverting.
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.713 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.714 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.714 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.714 187132 DEBUG oslo_concurrency.lockutils [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.715 187132 DEBUG nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] No waiting events found dispatching network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.715 187132 WARNING nova.compute.manager [req-247238fa-72dc-4652-bf2e-afe3c0fc939e req-76099c1c-f061-41a0-a63f-365969a274d2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received unexpected event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 for instance with vm_state resized and task_state resize_reverting.
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.716 187132 DEBUG nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Instance event wait completed in 5 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.727 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433278.727666, 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.728 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] VM Resumed (Lifecycle Event)
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.730 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.734 187132 INFO nova.virt.libvirt.driver [-] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Instance spawned successfully.
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.735 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.756 187132 DEBUG nova.network.neutron [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Updated VIF entry in instance network info cache for port 0a706dcf-eb29-4098-946a-e1a25e5587a8. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.757 187132 DEBUG nova.network.neutron [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Updating instance_info_cache with network_info: [{"id": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "address": "fa:16:3e:d8:5b:71", "network": {"id": "11d68746-7105-4c6c-a1c2-930f081e2867", "bridge": "br-int", "label": "tempest-TestServerMultinode-656670444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d95ba983e7b4ec7b161e0ab6b0b56ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a706dcf-eb", "ovs_interfaceid": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.767 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.772 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.773 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.773 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.774 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.774 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.774 187132 DEBUG nova.virt.libvirt.driver [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.778 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.809 187132 DEBUG oslo_concurrency.lockutils [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-d2953461-e3c8-4475-978e-99fe1b807179" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.810 187132 DEBUG nova.compute.manager [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Received event network-vif-plugged-c70283ea-f020-4b95-96ff-d6995a36ba20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.810 187132 DEBUG oslo_concurrency.lockutils [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.810 187132 DEBUG oslo_concurrency.lockutils [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.810 187132 DEBUG oslo_concurrency.lockutils [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.810 187132 DEBUG nova.compute.manager [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] No waiting events found dispatching network-vif-plugged-c70283ea-f020-4b95-96ff-d6995a36ba20 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.811 187132 WARNING nova.compute.manager [req-c8ff20df-9176-4b1c-9e7f-0234aab550e7 req-66143eff-f15a-4a82-88cd-f2b97eca5670 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Received unexpected event network-vif-plugged-c70283ea-f020-4b95-96ff-d6995a36ba20 for instance with vm_state active and task_state None.
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.815 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] During sync_power_state the instance has a pending task (spawning). Skip.
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.842 187132 INFO nova.compute.manager [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Swapping old allocation on dict_keys(['eece7817-9d4f-4ebe-96c8-a659f76170f9']) held by migration 04048fc3-ae6d-4f80-b187-c86e827f5c56 for instance
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.853 187132 INFO nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Took 18.99 seconds to spawn the instance on the hypervisor.
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.853 187132 DEBUG nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.880 187132 DEBUG oslo_concurrency.lockutils [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Acquiring lock "e259711f-fca8-4dd1-9fd0-b49e0404776f" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.881 187132 DEBUG oslo_concurrency.lockutils [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Lock "e259711f-fca8-4dd1-9fd0-b49e0404776f" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.881 187132 DEBUG oslo_concurrency.lockutils [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Acquiring lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.881 187132 DEBUG oslo_concurrency.lockutils [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.882 187132 DEBUG oslo_concurrency.lockutils [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.883 187132 INFO nova.compute.manager [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Terminating instance#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.884 187132 DEBUG nova.compute.manager [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.895 187132 DEBUG nova.scheduler.client.report [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Overwriting current allocation {'allocations': {'fdfa9d38-916c-46d3-83d0-a0a2657ef77d': {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}, 'generation': 14}}, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'consumer_generation': 1} on consumer d29187d8-59e6-4e5a-aef7-97fef6cf24c7 move_allocations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:2018#033[00m
Dec 11 01:07:58 np0005554845 kernel: tapc70283ea-f0 (unregistering): left promiscuous mode
Dec 11 01:07:58 np0005554845 NetworkManager[55529]: <info>  [1765433278.9048] device (tapc70283ea-f0): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.914 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:58Z|00095|binding|INFO|Releasing lport c70283ea-f020-4b95-96ff-d6995a36ba20 from this chassis (sb_readonly=0)
Dec 11 01:07:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:58Z|00096|binding|INFO|Setting lport c70283ea-f020-4b95-96ff-d6995a36ba20 down in Southbound
Dec 11 01:07:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:58Z|00097|binding|INFO|Releasing lport 5877ffb5-7529-4bf6-bc7f-c3f20519f897 from this chassis (sb_readonly=0)
Dec 11 01:07:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:58Z|00098|binding|INFO|Setting lport 5877ffb5-7529-4bf6-bc7f-c3f20519f897 down in Southbound
Dec 11 01:07:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:58Z|00099|binding|INFO|Removing iface tapc70283ea-f0 ovn-installed in OVS
Dec 11 01:07:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:58Z|00100|binding|INFO|Releasing lport b370d7e1-99df-455c-a61a-d60c4eb58c3f from this chassis (sb_readonly=0)
Dec 11 01:07:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:58Z|00101|binding|INFO|Releasing lport 758c2e66-9229-4c0e-a50a-c862b3cbb788 from this chassis (sb_readonly=0)
Dec 11 01:07:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:58Z|00102|binding|INFO|Releasing lport 19fbb851-56d3-4e9c-872f-295bbcc3715e from this chassis (sb_readonly=0)
Dec 11 01:07:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:58Z|00103|binding|INFO|Releasing lport c46efaa2-51f3-49f2-88d1-957ae0b2127e from this chassis (sb_readonly=0)
Dec 11 01:07:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:07:58Z|00104|binding|INFO|Releasing lport 62d8ffff-8a6f-415b-a1d8-3125ebbaf874 from this chassis (sb_readonly=0)
Dec 11 01:07:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:58.923 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ab:dd:f9 19.80.0.185'], port_security=['fa:16:3e:ab:dd:f9 19.80.0.185'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=['c70283ea-f020-4b95-96ff-d6995a36ba20'], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-subport-1138526966', 'neutron:cidrs': '19.80.0.185/24', 'neutron:device_id': '', 'neutron:device_owner': 'trunk:subport', 'neutron:mtu': '', 'neutron:network_name': 'neutron-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-subport-1138526966', 'neutron:project_id': '7936cace634747e4997212d1e4422555', 'neutron:revision_number': '5', 'neutron:security_group_ids': 'b088e580-6b47-485c-9cd1-4a797e9267e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[42], additional_encap=[], encap=[], mirror_rules=[], datapath=52455e06-6286-47a1-bae7-41dc34cce60e, chassis=[], tunnel_key=2, gateway_chassis=[], requested_chassis=[], logical_port=5877ffb5-7529-4bf6-bc7f-c3f20519f897) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:07:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:58.924 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:8d:41:0e 10.100.0.10'], port_security=['fa:16:3e:8d:41:0e 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'name': 'tempest-parent-812359412', 'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'e259711f-fca8-4dd1-9fd0-b49e0404776f', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-88c1a45b-56e2-4aa6-a974-6011ef55c52b', 'neutron:port_capabilities': '', 'neutron:port_name': 'tempest-parent-812359412', 'neutron:project_id': '7936cace634747e4997212d1e4422555', 'neutron:revision_number': '11', 'neutron:security_group_ids': 'b088e580-6b47-485c-9cd1-4a797e9267e3', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-0.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4b97d8aa-5e3a-46a8-a957-f048dac2ebf8, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=c70283ea-f020-4b95-96ff-d6995a36ba20) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:07:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:58.925 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 5877ffb5-7529-4bf6-bc7f-c3f20519f897 in datapath dcd4b1db-0d1a-44e0-b910-2ed7106fc09e unbound from our chassis#033[00m
Dec 11 01:07:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:58.928 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network dcd4b1db-0d1a-44e0-b910-2ed7106fc09e, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:07:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:58.928 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ff68945e-f7ef-48b7-a369-bc021ad86456]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:58.929 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e namespace which is not needed anymore#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.935 187132 INFO nova.compute.manager [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Took 19.48 seconds to build instance.#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.940 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.957 187132 DEBUG oslo_concurrency.lockutils [None req-f734f45c-32f4-4018-8384-ab9bbd4ab76a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 19.581s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.977 187132 DEBUG nova.compute.manager [req-1beb1f98-dbc4-410f-bbe5-2c02e4862e12 req-d95a6fa4-8c67-4262-b482-865da926ed72 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Received event network-vif-plugged-0a706dcf-eb29-4098-946a-e1a25e5587a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.978 187132 DEBUG oslo_concurrency.lockutils [req-1beb1f98-dbc4-410f-bbe5-2c02e4862e12 req-d95a6fa4-8c67-4262-b482-865da926ed72 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d2953461-e3c8-4475-978e-99fe1b807179-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.978 187132 DEBUG oslo_concurrency.lockutils [req-1beb1f98-dbc4-410f-bbe5-2c02e4862e12 req-d95a6fa4-8c67-4262-b482-865da926ed72 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.979 187132 DEBUG oslo_concurrency.lockutils [req-1beb1f98-dbc4-410f-bbe5-2c02e4862e12 req-d95a6fa4-8c67-4262-b482-865da926ed72 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.979 187132 DEBUG nova.compute.manager [req-1beb1f98-dbc4-410f-bbe5-2c02e4862e12 req-d95a6fa4-8c67-4262-b482-865da926ed72 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Processing event network-vif-plugged-0a706dcf-eb29-4098-946a-e1a25e5587a8 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.979 187132 DEBUG nova.compute.manager [req-1beb1f98-dbc4-410f-bbe5-2c02e4862e12 req-d95a6fa4-8c67-4262-b482-865da926ed72 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Received event network-vif-plugged-0a706dcf-eb29-4098-946a-e1a25e5587a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.979 187132 DEBUG oslo_concurrency.lockutils [req-1beb1f98-dbc4-410f-bbe5-2c02e4862e12 req-d95a6fa4-8c67-4262-b482-865da926ed72 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d2953461-e3c8-4475-978e-99fe1b807179-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.979 187132 DEBUG oslo_concurrency.lockutils [req-1beb1f98-dbc4-410f-bbe5-2c02e4862e12 req-d95a6fa4-8c67-4262-b482-865da926ed72 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.980 187132 DEBUG oslo_concurrency.lockutils [req-1beb1f98-dbc4-410f-bbe5-2c02e4862e12 req-d95a6fa4-8c67-4262-b482-865da926ed72 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.980 187132 DEBUG nova.compute.manager [req-1beb1f98-dbc4-410f-bbe5-2c02e4862e12 req-d95a6fa4-8c67-4262-b482-865da926ed72 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] No waiting events found dispatching network-vif-plugged-0a706dcf-eb29-4098-946a-e1a25e5587a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.980 187132 WARNING nova.compute.manager [req-1beb1f98-dbc4-410f-bbe5-2c02e4862e12 req-d95a6fa4-8c67-4262-b482-865da926ed72 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Received unexpected event network-vif-plugged-0a706dcf-eb29-4098-946a-e1a25e5587a8 for instance with vm_state building and task_state spawning.#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.981 187132 DEBUG nova.compute.manager [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.990 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433278.989272, d2953461-e3c8-4475-978e-99fe1b807179 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.990 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d2953461-e3c8-4475-978e-99fe1b807179] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.993 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:07:58 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.999 187132 INFO nova.virt.libvirt.driver [-] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Instance spawned successfully.#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:58.999 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:07:59 np0005554845 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Deactivated successfully.
Dec 11 01:07:59 np0005554845 systemd[1]: machine-qemu\x2d5\x2dinstance\x2d0000000c.scope: Consumed 1.812s CPU time.
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.002 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:59 np0005554845 systemd-machined[153381]: Machine qemu-5-instance-0000000c terminated.
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.028 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.034 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.039 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.039 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.040 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.040 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.040 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.041 187132 DEBUG nova.virt.libvirt.driver [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:07:59 np0005554845 neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e[215915]: [NOTICE]   (215919) : haproxy version is 2.8.14-c23fe91
Dec 11 01:07:59 np0005554845 neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e[215915]: [NOTICE]   (215919) : path to executable is /usr/sbin/haproxy
Dec 11 01:07:59 np0005554845 neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e[215915]: [WARNING]  (215919) : Exiting Master process...
Dec 11 01:07:59 np0005554845 neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e[215915]: [WARNING]  (215919) : Exiting Master process...
Dec 11 01:07:59 np0005554845 systemd[1]: libpod-d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212.scope: Deactivated successfully.
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.095 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d2953461-e3c8-4475-978e-99fe1b807179] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:07:59 np0005554845 neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e[215915]: [ALERT]    (215919) : Current worker (215921) exited with code 143 (Terminated)
Dec 11 01:07:59 np0005554845 neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e[215915]: [WARNING]  (215919) : All workers exited. Exiting... (0)
Dec 11 01:07:59 np0005554845 conmon[215915]: conmon d7f4aac043023c6377cf <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212.scope/container/memory.events
Dec 11 01:07:59 np0005554845 podman[216382]: 2025-12-11 06:07:59.103129231 +0000 UTC m=+0.056989035 container died d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.112 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.116 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:59 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212-userdata-shm.mount: Deactivated successfully.
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.149 187132 INFO nova.compute.manager [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Took 12.73 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.149 187132 DEBUG nova.compute.manager [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:07:59 np0005554845 systemd[1]: var-lib-containers-storage-overlay-e12cb972cb79968f308fe5e78b1db125dc616fbac33687c55cf48a1ec3c94ca8-merged.mount: Deactivated successfully.
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.164 187132 INFO nova.virt.libvirt.driver [-] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Instance destroyed successfully.#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.164 187132 DEBUG nova.objects.instance [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Lazy-loading 'resources' on Instance uuid e259711f-fca8-4dd1-9fd0-b49e0404776f obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:07:59 np0005554845 podman[216382]: 2025-12-11 06:07:59.166905329 +0000 UTC m=+0.120765123 container cleanup d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:07:59 np0005554845 systemd[1]: libpod-conmon-d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212.scope: Deactivated successfully.
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.188 187132 DEBUG nova.virt.libvirt.vif [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-11T06:07:00Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description=None,display_name='tempest-LiveAutoBlockMigrationV225Test-server-2099766739',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-liveautoblockmigrationv225test-server-2099766739',id=12,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:07:16Z,launched_on='compute-0.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='7936cace634747e4997212d1e4422555',ramdisk_id='',reservation_id='r-z9l0w2g1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-LiveAutoBlockMigrationV225Test-1239256349',owner_user_name='tempest-LiveAutoBlockMigrationV225Test-1239256349-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:07:54Z,user_data=None,user_id='dc2400e30fa0477abb781abef37fc5a4',uuid=e259711f-fca8-4dd1-9fd0-b49e0404776f,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "c70283ea-f020-4b95-96ff-d6995a36ba20", "address": "fa:16:3e:8d:41:0e", "network": {"id": "88c1a45b-56e2-4aa6-a974-6011ef55c52b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-210095557-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7936cace634747e4997212d1e4422555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc70283ea-f0", "ovs_interfaceid": "c70283ea-f020-4b95-96ff-d6995a36ba20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.189 187132 DEBUG nova.network.os_vif_util [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Converting VIF {"id": "c70283ea-f020-4b95-96ff-d6995a36ba20", "address": "fa:16:3e:8d:41:0e", "network": {"id": "88c1a45b-56e2-4aa6-a974-6011ef55c52b", "bridge": "br-int", "label": "tempest-LiveAutoBlockMigrationV225Test-210095557-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "7936cace634747e4997212d1e4422555", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapc70283ea-f0", "ovs_interfaceid": "c70283ea-f020-4b95-96ff-d6995a36ba20", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {"os_vif_delegation": true}, "preserve_on_delete": true, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.190 187132 DEBUG nova.network.os_vif_util [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:8d:41:0e,bridge_name='br-int',has_traffic_filtering=True,id=c70283ea-f020-4b95-96ff-d6995a36ba20,network=Network(88c1a45b-56e2-4aa6-a974-6011ef55c52b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc70283ea-f0') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.190 187132 DEBUG os_vif [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:8d:41:0e,bridge_name='br-int',has_traffic_filtering=True,id=c70283ea-f020-4b95-96ff-d6995a36ba20,network=Network(88c1a45b-56e2-4aa6-a974-6011ef55c52b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc70283ea-f0') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.192 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.193 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc70283ea-f0, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.194 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.196 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.198 187132 INFO os_vif [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:8d:41:0e,bridge_name='br-int',has_traffic_filtering=True,id=c70283ea-f020-4b95-96ff-d6995a36ba20,network=Network(88c1a45b-56e2-4aa6-a974-6011ef55c52b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=True,vif_name='tapc70283ea-f0')#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.199 187132 INFO nova.virt.libvirt.driver [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Deleting instance files /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f_del#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.200 187132 INFO nova.virt.libvirt.driver [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Deletion of /var/lib/nova/instances/e259711f-fca8-4dd1-9fd0-b49e0404776f_del complete#033[00m
Dec 11 01:07:59 np0005554845 podman[216428]: 2025-12-11 06:07:59.230235265 +0000 UTC m=+0.044969719 container remove d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.237 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b9d4668b-7a40-41b3-acd5-10abdf06ecd0]: (4, ('Thu Dec 11 06:07:59 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e (d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212)\nd7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212\nThu Dec 11 06:07:59 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e (d7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212)\nd7f4aac043023c6377cf026d54155f7aa06aa0af48877b5e67b91c9c1c7ac212\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.245 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[895d81e8-9bb8-49b7-ad77-9e62a1fb44e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.246 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapdcd4b1db-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:59 np0005554845 kernel: tapdcd4b1db-00: left promiscuous mode
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.250 187132 INFO nova.compute.manager [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Took 14.47 seconds to build instance.#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.252 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.254 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[87ca1fdd-22b0-4531-99ae-fae6022eb19e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.264 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.274 187132 INFO nova.compute.manager [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.275 187132 DEBUG oslo.service.loopingcall [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.276 187132 DEBUG nova.compute.manager [-] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.276 187132 DEBUG nova.network.neutron [-] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.278 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4a6e7da8-db73-4cb7-af40-6a755df97f8b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.280 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[897ebbe1-e1e1-4ba4-a3e3-866068420e72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.280 187132 DEBUG oslo_concurrency.lockutils [None req-85a55bb3-f6db-4765-8603-b29a2819e5e1 bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 14.616s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.293 187132 INFO nova.network.neutron [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating port aee944ef-3d55-4d72-85fd-0bcba5cebad9 with attributes {'binding:host_id': 'compute-2.ctlplane.example.com', 'device_owner': 'compute:nova'}#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.296 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c3820629-23ed-4649-b2d6-241f8c3edf8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354359, 'reachable_time': 18286, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216443, 'error': None, 'target': 'ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 systemd[1]: run-netns-ovnmeta\x2ddcd4b1db\x2d0d1a\x2d44e0\x2db910\x2d2ed7106fc09e.mount: Deactivated successfully.
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.303 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-dcd4b1db-0d1a-44e0-b910-2ed7106fc09e deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.303 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[091bf707-41b9-4785-a608-faf0717ece14]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.304 104320 INFO neutron.agent.ovn.metadata.agent [-] Port c70283ea-f020-4b95-96ff-d6995a36ba20 in datapath 88c1a45b-56e2-4aa6-a974-6011ef55c52b unbound from our chassis#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.306 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 88c1a45b-56e2-4aa6-a974-6011ef55c52b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.307 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a2a7dbdf-59ec-43bb-9eb1-71b4c099c321]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.307 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b namespace which is not needed anymore#033[00m
Dec 11 01:07:59 np0005554845 neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b[215992]: [NOTICE]   (215996) : haproxy version is 2.8.14-c23fe91
Dec 11 01:07:59 np0005554845 neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b[215992]: [NOTICE]   (215996) : path to executable is /usr/sbin/haproxy
Dec 11 01:07:59 np0005554845 neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b[215992]: [WARNING]  (215996) : Exiting Master process...
Dec 11 01:07:59 np0005554845 neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b[215992]: [ALERT]    (215996) : Current worker (215998) exited with code 143 (Terminated)
Dec 11 01:07:59 np0005554845 neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b[215992]: [WARNING]  (215996) : All workers exited. Exiting... (0)
Dec 11 01:07:59 np0005554845 systemd[1]: libpod-c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108.scope: Deactivated successfully.
Dec 11 01:07:59 np0005554845 conmon[215992]: conmon c576458841028ae291b7 <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108.scope/container/memory.events
Dec 11 01:07:59 np0005554845 podman[216460]: 2025-12-11 06:07:59.445044037 +0000 UTC m=+0.046348308 container died c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:07:59 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108-userdata-shm.mount: Deactivated successfully.
Dec 11 01:07:59 np0005554845 systemd[1]: var-lib-containers-storage-overlay-59245bde2828d4faa2de49a2fbf8357d20eef2ded790bc6f110e77d97795e140-merged.mount: Deactivated successfully.
Dec 11 01:07:59 np0005554845 podman[216460]: 2025-12-11 06:07:59.479453089 +0000 UTC m=+0.080757370 container cleanup c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 11 01:07:59 np0005554845 systemd[1]: libpod-conmon-c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108.scope: Deactivated successfully.
Dec 11 01:07:59 np0005554845 podman[216491]: 2025-12-11 06:07:59.555673765 +0000 UTC m=+0.049686708 container remove c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.561 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[34a27135-be60-4f91-83e4-76063627a186]: (4, ('Thu Dec 11 06:07:59 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b (c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108)\nc576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108\nThu Dec 11 06:07:59 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b (c576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108)\nc576458841028ae291b701a746a84625976cc07dd7faf5094f0bfb2f50fc7108\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.563 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[73a18631-228d-4a5f-87c6-02c37dfcda07]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.564 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap88c1a45b-50, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.566 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:59 np0005554845 kernel: tap88c1a45b-50: left promiscuous mode
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.568 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.571 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[21f2ac87-3180-4e50-9cb4-4dabf39123e9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.583 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.596 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ce8aa9e4-a279-4b1e-a004-c032c3dde7bd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.598 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[08cae661-673a-437c-978c-e47b77e8050b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.618 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7a13c98c-6abf-488b-af9f-47888dd8574e]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354450, 'reachable_time': 20313, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216506, 'error': None, 'target': 'ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.620 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-88c1a45b-56e2-4aa6-a974-6011ef55c52b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:07:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:07:59.620 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[38c4eeb3-0170-438a-b6a4-b6b21714fffe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:07:59 np0005554845 nova_compute[187128]: 2025-12-11 06:07:59.882 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:00 np0005554845 systemd[1]: run-netns-ovnmeta\x2d88c1a45b\x2d56e2\x2d4aa6\x2da974\x2d6011ef55c52b.mount: Deactivated successfully.
Dec 11 01:08:01 np0005554845 nova_compute[187128]: 2025-12-11 06:08:01.418 187132 DEBUG nova.compute.manager [req-d6502a87-6eab-4a60-b11e-e11fb8cb892f req-8b504b2c-818c-4cdf-b428-574a98a0a46c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Received event network-vif-unplugged-c70283ea-f020-4b95-96ff-d6995a36ba20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:08:01 np0005554845 nova_compute[187128]: 2025-12-11 06:08:01.419 187132 DEBUG oslo_concurrency.lockutils [req-d6502a87-6eab-4a60-b11e-e11fb8cb892f req-8b504b2c-818c-4cdf-b428-574a98a0a46c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:01 np0005554845 nova_compute[187128]: 2025-12-11 06:08:01.419 187132 DEBUG oslo_concurrency.lockutils [req-d6502a87-6eab-4a60-b11e-e11fb8cb892f req-8b504b2c-818c-4cdf-b428-574a98a0a46c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:01 np0005554845 nova_compute[187128]: 2025-12-11 06:08:01.420 187132 DEBUG oslo_concurrency.lockutils [req-d6502a87-6eab-4a60-b11e-e11fb8cb892f req-8b504b2c-818c-4cdf-b428-574a98a0a46c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:01 np0005554845 nova_compute[187128]: 2025-12-11 06:08:01.420 187132 DEBUG nova.compute.manager [req-d6502a87-6eab-4a60-b11e-e11fb8cb892f req-8b504b2c-818c-4cdf-b428-574a98a0a46c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] No waiting events found dispatching network-vif-unplugged-c70283ea-f020-4b95-96ff-d6995a36ba20 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:08:01 np0005554845 nova_compute[187128]: 2025-12-11 06:08:01.420 187132 DEBUG nova.compute.manager [req-d6502a87-6eab-4a60-b11e-e11fb8cb892f req-8b504b2c-818c-4cdf-b428-574a98a0a46c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Received event network-vif-unplugged-c70283ea-f020-4b95-96ff-d6995a36ba20 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:08:01 np0005554845 nova_compute[187128]: 2025-12-11 06:08:01.496 187132 DEBUG nova.network.neutron [-] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:08:01 np0005554845 nova_compute[187128]: 2025-12-11 06:08:01.498 187132 DEBUG oslo_concurrency.lockutils [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:08:01 np0005554845 nova_compute[187128]: 2025-12-11 06:08:01.498 187132 DEBUG oslo_concurrency.lockutils [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquired lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:08:01 np0005554845 nova_compute[187128]: 2025-12-11 06:08:01.498 187132 DEBUG nova.network.neutron [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:08:01 np0005554845 nova_compute[187128]: 2025-12-11 06:08:01.524 187132 INFO nova.compute.manager [-] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Took 2.25 seconds to deallocate network for instance.#033[00m
Dec 11 01:08:02 np0005554845 nova_compute[187128]: 2025-12-11 06:08:02.064 187132 DEBUG oslo_concurrency.lockutils [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:02 np0005554845 nova_compute[187128]: 2025-12-11 06:08:02.064 187132 DEBUG oslo_concurrency.lockutils [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:02 np0005554845 nova_compute[187128]: 2025-12-11 06:08:02.070 187132 DEBUG oslo_concurrency.lockutils [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.006s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:02 np0005554845 nova_compute[187128]: 2025-12-11 06:08:02.118 187132 INFO nova.scheduler.client.report [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Deleted allocations for instance e259711f-fca8-4dd1-9fd0-b49e0404776f#033[00m
Dec 11 01:08:02 np0005554845 nova_compute[187128]: 2025-12-11 06:08:02.187 187132 DEBUG oslo_concurrency.lockutils [None req-e19653f3-6e22-4027-85ed-ea1b4fe3cd03 dc2400e30fa0477abb781abef37fc5a4 7936cace634747e4997212d1e4422555 - - default default] Lock "e259711f-fca8-4dd1-9fd0-b49e0404776f" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.306s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.772 187132 DEBUG nova.network.neutron [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance_info_cache with network_info: [{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.794 187132 DEBUG oslo_concurrency.lockutils [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Releasing lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.794 187132 DEBUG nova.virt.libvirt.driver [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Starting finish_revert_migration finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11843#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.802 187132 DEBUG nova.virt.libvirt.driver [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Start _get_guest_xml network_info=[{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.806 187132 WARNING nova.virt.libvirt.driver [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.810 187132 DEBUG nova.virt.libvirt.host [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.811 187132 DEBUG nova.virt.libvirt.host [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.815 187132 DEBUG nova.virt.libvirt.host [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.816 187132 DEBUG nova.virt.libvirt.host [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.817 187132 DEBUG nova.virt.libvirt.driver [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.818 187132 DEBUG nova.virt.hardware [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum=<?>,container_format='bare',created_at=<?>,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=1,min_ram=0,name=<?>,owner=<?>,properties=ImageMetaProps,protected=<?>,size=<?>,status=<?>,tags=<?>,updated_at=<?>,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.818 187132 DEBUG nova.virt.hardware [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.819 187132 DEBUG nova.virt.hardware [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.819 187132 DEBUG nova.virt.hardware [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.820 187132 DEBUG nova.virt.hardware [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.820 187132 DEBUG nova.virt.hardware [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.821 187132 DEBUG nova.virt.hardware [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.821 187132 DEBUG nova.virt.hardware [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.821 187132 DEBUG nova.virt.hardware [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.822 187132 DEBUG nova.virt.hardware [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.822 187132 DEBUG nova.virt.hardware [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.823 187132 DEBUG nova.objects.instance [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'vcpu_model' on Instance uuid d29187d8-59e6-4e5a-aef7-97fef6cf24c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.839 187132 DEBUG oslo_concurrency.processutils [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.config --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.897 187132 DEBUG oslo_concurrency.processutils [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.config --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.899 187132 DEBUG oslo_concurrency.lockutils [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.899 187132 DEBUG oslo_concurrency.lockutils [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.900 187132 DEBUG oslo_concurrency.lockutils [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.901 187132 DEBUG nova.virt.libvirt.vif [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-11T06:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1186378686',display_name='tempest-TestNetworkAdvancedServerOps-server-1186378686',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1186378686',id=10,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDvu269K3Wq5vyC2HL1b6A8dJTLjcpEsj7D9cVxVm9DHphj86xufSg/vW/M3Pt7tVDz6L3awCoDApZq7RNDhAAwSmH7Z/SFby/7dDKGWNDp4HLOCIl9fXp9onGKTfEF+xg==',key_name='tempest-TestNetworkAdvancedServerOps-1141582953',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:07:50Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-2cydsqis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:07:58Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=d29187d8-59e6-4e5a-aef7-97fef6cf24c7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.901 187132 DEBUG nova.network.os_vif_util [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.902 187132 DEBUG nova.network.os_vif_util [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.904 187132 DEBUG nova.virt.libvirt.driver [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  <uuid>d29187d8-59e6-4e5a-aef7-97fef6cf24c7</uuid>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  <name>instance-0000000a</name>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1186378686</nova:name>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:08:03</nova:creationTime>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:        <nova:user uuid="40cb523bfe1e4484bb2e91c903500c97">tempest-TestNetworkAdvancedServerOps-369129245-project-member</nova:user>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:        <nova:project uuid="3ec4c03cd7274517b88d9087ad4cbd83">tempest-TestNetworkAdvancedServerOps-369129245</nova:project>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:        <nova:port uuid="aee944ef-3d55-4d72-85fd-0bcba5cebad9">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <entry name="serial">d29187d8-59e6-4e5a-aef7-97fef6cf24c7</entry>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <entry name="uuid">d29187d8-59e6-4e5a-aef7-97fef6cf24c7</entry>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk.config"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:39:5c:9d"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <target dev="tapaee944ef-3d"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/console.log" append="off"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <input type="keyboard" bus="usb"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:08:03 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:08:03 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:08:03 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:08:03 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.910 187132 DEBUG nova.compute.manager [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Preparing to wait for external event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.910 187132 DEBUG oslo_concurrency.lockutils [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.910 187132 DEBUG oslo_concurrency.lockutils [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.911 187132 DEBUG oslo_concurrency.lockutils [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.911 187132 DEBUG nova.virt.libvirt.vif [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-11T06:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1186378686',display_name='tempest-TestNetworkAdvancedServerOps-server-1186378686',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1186378686',id=10,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDvu269K3Wq5vyC2HL1b6A8dJTLjcpEsj7D9cVxVm9DHphj86xufSg/vW/M3Pt7tVDz6L3awCoDApZq7RNDhAAwSmH7Z/SFby/7dDKGWNDp4HLOCIl9fXp9onGKTfEF+xg==',key_name='tempest-TestNetworkAdvancedServerOps-1141582953',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:07:50Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=MigrationContext,new_flavor=Flavor(1),node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=Flavor(1),os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-2cydsqis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='resize_reverting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:07:58Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=d29187d8-59e6-4e5a-aef7-97fef6cf24c7,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='resized') vif={"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], 
"gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.912 187132 DEBUG nova.network.os_vif_util [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.912 187132 DEBUG nova.network.os_vif_util [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.913 187132 DEBUG os_vif [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.913 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.914 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.914 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.916 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.916 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapaee944ef-3d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.917 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapaee944ef-3d, col_values=(('external_ids', {'iface-id': 'aee944ef-3d55-4d72-85fd-0bcba5cebad9', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:39:5c:9d', 'vm-uuid': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.919 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:03 np0005554845 NetworkManager[55529]: <info>  [1765433283.9200] manager: (tapaee944ef-3d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/61)
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.921 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.923 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.924 187132 INFO os_vif [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d')#033[00m
Dec 11 01:08:03 np0005554845 nova_compute[187128]: 2025-12-11 06:08:03.994 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:03 np0005554845 NetworkManager[55529]: <info>  [1765433283.9952] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/62)
Dec 11 01:08:03 np0005554845 NetworkManager[55529]: <info>  [1765433283.9967] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/63)
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.100 187132 DEBUG nova.compute.manager [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Received event network-vif-plugged-c70283ea-f020-4b95-96ff-d6995a36ba20 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.101 187132 DEBUG oslo_concurrency.lockutils [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.102 187132 DEBUG oslo_concurrency.lockutils [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.102 187132 DEBUG oslo_concurrency.lockutils [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e259711f-fca8-4dd1-9fd0-b49e0404776f-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.102 187132 DEBUG nova.compute.manager [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] No waiting events found dispatching network-vif-plugged-c70283ea-f020-4b95-96ff-d6995a36ba20 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.102 187132 WARNING nova.compute.manager [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Received unexpected event network-vif-plugged-c70283ea-f020-4b95-96ff-d6995a36ba20 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.103 187132 DEBUG nova.compute.manager [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-changed-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.104 187132 DEBUG nova.compute.manager [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Refreshing instance network info cache due to event network-changed-aee944ef-3d55-4d72-85fd-0bcba5cebad9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.104 187132 DEBUG oslo_concurrency.lockutils [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.104 187132 DEBUG oslo_concurrency.lockutils [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.104 187132 DEBUG nova.network.neutron [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Refreshing network info cache for port aee944ef-3d55-4d72-85fd-0bcba5cebad9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:08:04 np0005554845 NetworkManager[55529]: <info>  [1765433284.1368] manager: (tapaee944ef-3d): new Tun device (/org/freedesktop/NetworkManager/Devices/64)
Dec 11 01:08:04 np0005554845 systemd-udevd[216527]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:08:04 np0005554845 systemd-machined[153381]: New machine qemu-8-instance-0000000a.
Dec 11 01:08:04 np0005554845 kernel: tapaee944ef-3d: entered promiscuous mode
Dec 11 01:08:04 np0005554845 NetworkManager[55529]: <info>  [1765433284.2003] device (tapaee944ef-3d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:08:04 np0005554845 NetworkManager[55529]: <info>  [1765433284.2019] device (tapaee944ef-3d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:08:04 np0005554845 systemd[1]: Started Virtual Machine qemu-8-instance-0000000a.
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.203 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:04 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:04Z|00105|binding|INFO|Releasing lport 758c2e66-9229-4c0e-a50a-c862b3cbb788 from this chassis (sb_readonly=0)
Dec 11 01:08:04 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:04Z|00106|binding|INFO|Releasing lport 19fbb851-56d3-4e9c-872f-295bbcc3715e from this chassis (sb_readonly=0)
Dec 11 01:08:04 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:04Z|00107|binding|INFO|Claiming lport aee944ef-3d55-4d72-85fd-0bcba5cebad9 for this chassis.
Dec 11 01:08:04 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:04Z|00108|binding|INFO|aee944ef-3d55-4d72-85fd-0bcba5cebad9: Claiming fa:16:3e:39:5c:9d 10.100.0.14
Dec 11 01:08:04 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:04Z|00109|binding|INFO|Releasing lport 62d8ffff-8a6f-415b-a1d8-3125ebbaf874 from this chassis (sb_readonly=0)
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.240 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.247 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:5c:9d 10.100.0.14'], port_security=['fa:16:3e:39:5c:9d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '10', 'neutron:security_group_ids': '25d5132c-a309-410e-93c9-7759e7948f62', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.219'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=114c3962-e260-4a4f-84c2-081b45071782, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=aee944ef-3d55-4d72-85fd-0bcba5cebad9) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.248 104320 INFO neutron.agent.ovn.metadata.agent [-] Port aee944ef-3d55-4d72-85fd-0bcba5cebad9 in datapath fa8f22dd-28ac-458d-9f63-a7d8a915d217 bound to our chassis#033[00m
Dec 11 01:08:04 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:04Z|00110|binding|INFO|Setting lport aee944ef-3d55-4d72-85fd-0bcba5cebad9 ovn-installed in OVS
Dec 11 01:08:04 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:04Z|00111|binding|INFO|Setting lport aee944ef-3d55-4d72-85fd-0bcba5cebad9 up in Southbound
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.252 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.253 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network fa8f22dd-28ac-458d-9f63-a7d8a915d217#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.268 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4a3dba-9ef6-4db0-8769-a4b362ea9183]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.270 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapfa8f22dd-21 in ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.272 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapfa8f22dd-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.273 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[dadea508-6aca-4882-bc3b-ad8767636fb4]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.274 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a978d581-cef4-4763-833d-361d347c42be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.293 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[5566afd7-bfcf-44e2-951e-6835c561dfd3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.311 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1fa6f522-f42b-4061-93e1-a343009f36ef]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.349 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[6687b811-0fb4-4e13-972f-9a8d3272f62f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.354 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c00f31fb-7f84-4ecb-b96b-9f7bf791fef7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 NetworkManager[55529]: <info>  [1765433284.3558] manager: (tapfa8f22dd-20): new Veth device (/org/freedesktop/NetworkManager/Devices/65)
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.392 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[00aa8bd6-a91c-4971-9343-ac6ef30f7834]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.394 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[cc7c6b01-8212-4206-aa1d-78540ea4471c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 NetworkManager[55529]: <info>  [1765433284.4182] device (tapfa8f22dd-20): carrier: link connected
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.423 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[3c0e1b9e-44b0-4686-b1fe-c36803b566b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.439 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8a186156-9896-46c0-8729-dd5e89f42c12]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa8f22dd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:f5:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355906, 'reachable_time': 37287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216561, 'error': None, 'target': 'ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.457 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f2177e4b-e4b8-477b-a10e-ca91d15b21c2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe07:f5a2'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 355906, 'tstamp': 355906}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 216562, 'error': None, 'target': 'ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.471 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fa40e232-f4ad-4063-a812-9e21c2c101d3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapfa8f22dd-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:07:f5:a2'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 34], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355906, 'reachable_time': 37287, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 216563, 'error': None, 'target': 'ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.504 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0c6bbe98-a833-4846-b981-c608f2b4658b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.547 187132 DEBUG nova.compute.manager [req-f3ffeadc-b343-44fa-8a8f-5bdc940df307 req-5e7bf268-d7cd-4ab1-8056-5ab16d93d2ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-changed-fb8865d1-91e3-4d6a-9437-231beabc5816 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.548 187132 DEBUG nova.compute.manager [req-f3ffeadc-b343-44fa-8a8f-5bdc940df307 req-5e7bf268-d7cd-4ab1-8056-5ab16d93d2ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Refreshing instance network info cache due to event network-changed-fb8865d1-91e3-4d6a-9437-231beabc5816. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.548 187132 DEBUG oslo_concurrency.lockutils [req-f3ffeadc-b343-44fa-8a8f-5bdc940df307 req-5e7bf268-d7cd-4ab1-8056-5ab16d93d2ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.548 187132 DEBUG oslo_concurrency.lockutils [req-f3ffeadc-b343-44fa-8a8f-5bdc940df307 req-5e7bf268-d7cd-4ab1-8056-5ab16d93d2ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.548 187132 DEBUG nova.network.neutron [req-f3ffeadc-b343-44fa-8a8f-5bdc940df307 req-5e7bf268-d7cd-4ab1-8056-5ab16d93d2ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Refreshing network info cache for port fb8865d1-91e3-4d6a-9437-231beabc5816 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.580 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3a38dd9e-87c8-41ba-a24d-150cb7c204c6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.582 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa8f22dd-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.582 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.583 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapfa8f22dd-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:08:04 np0005554845 NetworkManager[55529]: <info>  [1765433284.5857] manager: (tapfa8f22dd-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/66)
Dec 11 01:08:04 np0005554845 kernel: tapfa8f22dd-20: entered promiscuous mode
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.584 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.589 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapfa8f22dd-20, col_values=(('external_ids', {'iface-id': 'c1aa7c5f-7cb5-4f8e-b844-cd400103ee8b'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:08:04 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:04Z|00112|binding|INFO|Releasing lport c1aa7c5f-7cb5-4f8e-b844-cd400103ee8b from this chassis (sb_readonly=0)
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.590 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.604 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.610 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/fa8f22dd-28ac-458d-9f63-a7d8a915d217.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/fa8f22dd-28ac-458d-9f63-a7d8a915d217.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.611 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[26551356-2303-4d16-9b20-8d4ffae0ca83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.612 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-fa8f22dd-28ac-458d-9f63-a7d8a915d217
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/fa8f22dd-28ac-458d-9f63-a7d8a915d217.pid.haproxy
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID fa8f22dd-28ac-458d-9f63-a7d8a915d217
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:08:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:04.614 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'env', 'PROCESS_TAG=haproxy-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/fa8f22dd-28ac-458d-9f63-a7d8a915d217.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.744 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433284.7432907, d29187d8-59e6-4e5a-aef7-97fef6cf24c7 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.744 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] VM Started (Lifecycle Event)#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.768 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.773 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433284.7434409, d29187d8-59e6-4e5a-aef7-97fef6cf24c7 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.773 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.794 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.797 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.823 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Dec 11 01:08:04 np0005554845 nova_compute[187128]: 2025-12-11 06:08:04.883 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:05 np0005554845 podman[216602]: 2025-12-11 06:08:05.016872042 +0000 UTC m=+0.060133130 container create 842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 01:08:05 np0005554845 systemd[1]: Started libpod-conmon-842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b.scope.
Dec 11 01:08:05 np0005554845 podman[216602]: 2025-12-11 06:08:04.980944608 +0000 UTC m=+0.024205736 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:08:05 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:08:05 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/cbed6ccb7843ddac884ce7c7711f6271d30220ca39fd605573f306ba32c3b523/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:08:05 np0005554845 podman[216602]: 2025-12-11 06:08:05.109993246 +0000 UTC m=+0.153254334 container init 842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Dec 11 01:08:05 np0005554845 podman[216602]: 2025-12-11 06:08:05.119894664 +0000 UTC m=+0.163155742 container start 842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Dec 11 01:08:05 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[216617]: [NOTICE]   (216621) : New worker (216623) forked
Dec 11 01:08:05 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[216617]: [NOTICE]   (216621) : Loading success.
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.202 187132 DEBUG nova.compute.manager [req-7bd34c6d-cb85-41ba-87fe-7805fc4447a2 req-522e6291-216c-48f9-8e7e-14b4c8b37005 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.203 187132 DEBUG oslo_concurrency.lockutils [req-7bd34c6d-cb85-41ba-87fe-7805fc4447a2 req-522e6291-216c-48f9-8e7e-14b4c8b37005 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.203 187132 DEBUG oslo_concurrency.lockutils [req-7bd34c6d-cb85-41ba-87fe-7805fc4447a2 req-522e6291-216c-48f9-8e7e-14b4c8b37005 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.203 187132 DEBUG oslo_concurrency.lockutils [req-7bd34c6d-cb85-41ba-87fe-7805fc4447a2 req-522e6291-216c-48f9-8e7e-14b4c8b37005 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.203 187132 DEBUG nova.compute.manager [req-7bd34c6d-cb85-41ba-87fe-7805fc4447a2 req-522e6291-216c-48f9-8e7e-14b4c8b37005 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Processing event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.204 187132 DEBUG nova.compute.manager [req-7bd34c6d-cb85-41ba-87fe-7805fc4447a2 req-522e6291-216c-48f9-8e7e-14b4c8b37005 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.204 187132 DEBUG oslo_concurrency.lockutils [req-7bd34c6d-cb85-41ba-87fe-7805fc4447a2 req-522e6291-216c-48f9-8e7e-14b4c8b37005 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.204 187132 DEBUG oslo_concurrency.lockutils [req-7bd34c6d-cb85-41ba-87fe-7805fc4447a2 req-522e6291-216c-48f9-8e7e-14b4c8b37005 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.204 187132 DEBUG oslo_concurrency.lockutils [req-7bd34c6d-cb85-41ba-87fe-7805fc4447a2 req-522e6291-216c-48f9-8e7e-14b4c8b37005 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.204 187132 DEBUG nova.compute.manager [req-7bd34c6d-cb85-41ba-87fe-7805fc4447a2 req-522e6291-216c-48f9-8e7e-14b4c8b37005 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] No waiting events found dispatching network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.205 187132 WARNING nova.compute.manager [req-7bd34c6d-cb85-41ba-87fe-7805fc4447a2 req-522e6291-216c-48f9-8e7e-14b4c8b37005 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received unexpected event network-vif-plugged-aee944ef-3d55-4d72-85fd-0bcba5cebad9 for instance with vm_state resized and task_state resize_reverting.#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.205 187132 DEBUG nova.compute.manager [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.209 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433286.2091384, d29187d8-59e6-4e5a-aef7-97fef6cf24c7 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.210 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.214 187132 INFO nova.virt.libvirt.driver [-] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Instance running successfully.#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.214 187132 DEBUG nova.virt.libvirt.driver [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] finish_revert_migration finished successfully. finish_revert_migration /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:11887#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.238 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.245 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: resized, current task_state: resize_reverting, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.288 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] During sync_power_state the instance has a pending task (resize_reverting). Skip.#033[00m
Dec 11 01:08:06 np0005554845 nova_compute[187128]: 2025-12-11 06:08:06.300 187132 INFO nova.compute.manager [None req-4ba6ea25-6dc9-44e1-8a4b-a2e8cbbb1077 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance to original state: 'active'#033[00m
Dec 11 01:08:08 np0005554845 nova_compute[187128]: 2025-12-11 06:08:08.091 187132 DEBUG nova.network.neutron [req-f3ffeadc-b343-44fa-8a8f-5bdc940df307 req-5e7bf268-d7cd-4ab1-8056-5ab16d93d2ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Updated VIF entry in instance network info cache for port fb8865d1-91e3-4d6a-9437-231beabc5816. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:08:08 np0005554845 nova_compute[187128]: 2025-12-11 06:08:08.092 187132 DEBUG nova.network.neutron [req-f3ffeadc-b343-44fa-8a8f-5bdc940df307 req-5e7bf268-d7cd-4ab1-8056-5ab16d93d2ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Updating instance_info_cache with network_info: [{"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:08:08 np0005554845 nova_compute[187128]: 2025-12-11 06:08:08.110 187132 DEBUG oslo_concurrency.lockutils [req-f3ffeadc-b343-44fa-8a8f-5bdc940df307 req-5e7bf268-d7cd-4ab1-8056-5ab16d93d2ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:08:08 np0005554845 podman[216633]: 2025-12-11 06:08:08.129911759 +0000 UTC m=+0.062589106 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:08:08 np0005554845 nova_compute[187128]: 2025-12-11 06:08:08.920 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:09 np0005554845 nova_compute[187128]: 2025-12-11 06:08:09.413 187132 DEBUG nova.network.neutron [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updated VIF entry in instance network info cache for port aee944ef-3d55-4d72-85fd-0bcba5cebad9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:08:09 np0005554845 nova_compute[187128]: 2025-12-11 06:08:09.414 187132 DEBUG nova.network.neutron [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance_info_cache with network_info: [{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:08:09 np0005554845 nova_compute[187128]: 2025-12-11 06:08:09.439 187132 DEBUG oslo_concurrency.lockutils [req-ebb52da6-1ac3-4496-b879-bfd8008a822c req-73fb62e9-b23d-46ac-b19d-21ff3dfcd239 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:08:09 np0005554845 nova_compute[187128]: 2025-12-11 06:08:09.884 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:13 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:13Z|00012|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:49:01:48 10.100.0.11
Dec 11 01:08:13 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:13Z|00013|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:49:01:48 10.100.0.11
Dec 11 01:08:13 np0005554845 nova_compute[187128]: 2025-12-11 06:08:13.975 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:14 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:14Z|00014|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d8:5b:71 10.100.0.6
Dec 11 01:08:14 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:14Z|00015|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d8:5b:71 10.100.0.6
Dec 11 01:08:14 np0005554845 podman[216690]: 2025-12-11 06:08:14.129512709 +0000 UTC m=+0.050607773 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 11 01:08:14 np0005554845 nova_compute[187128]: 2025-12-11 06:08:14.163 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433279.1615968, e259711f-fca8-4dd1-9fd0-b49e0404776f => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:08:14 np0005554845 nova_compute[187128]: 2025-12-11 06:08:14.163 187132 INFO nova.compute.manager [-] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:08:14 np0005554845 nova_compute[187128]: 2025-12-11 06:08:14.184 187132 DEBUG nova.compute.manager [None req-88e51ac3-960d-4535-af68-24632c01fa47 - - - - - -] [instance: e259711f-fca8-4dd1-9fd0-b49e0404776f] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:08:14 np0005554845 nova_compute[187128]: 2025-12-11 06:08:14.886 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:14 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:14Z|00113|binding|INFO|Releasing lport 758c2e66-9229-4c0e-a50a-c862b3cbb788 from this chassis (sb_readonly=0)
Dec 11 01:08:14 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:14Z|00114|binding|INFO|Releasing lport 19fbb851-56d3-4e9c-872f-295bbcc3715e from this chassis (sb_readonly=0)
Dec 11 01:08:14 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:14Z|00115|binding|INFO|Releasing lport c1aa7c5f-7cb5-4f8e-b844-cd400103ee8b from this chassis (sb_readonly=0)
Dec 11 01:08:14 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:14Z|00116|binding|INFO|Releasing lport 62d8ffff-8a6f-415b-a1d8-3125ebbaf874 from this chassis (sb_readonly=0)
Dec 11 01:08:14 np0005554845 nova_compute[187128]: 2025-12-11 06:08:14.974 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:16 np0005554845 podman[216710]: 2025-12-11 06:08:16.145218271 +0000 UTC m=+0.079739612 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 01:08:16 np0005554845 podman[216711]: 2025-12-11 06:08:16.214278082 +0000 UTC m=+0.147940339 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 01:08:17 np0005554845 nova_compute[187128]: 2025-12-11 06:08:17.875 187132 DEBUG oslo_concurrency.lockutils [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Acquiring lock "d2953461-e3c8-4475-978e-99fe1b807179" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:17 np0005554845 nova_compute[187128]: 2025-12-11 06:08:17.876 187132 DEBUG oslo_concurrency.lockutils [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:17 np0005554845 nova_compute[187128]: 2025-12-11 06:08:17.876 187132 DEBUG oslo_concurrency.lockutils [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Acquiring lock "d2953461-e3c8-4475-978e-99fe1b807179-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:17 np0005554845 nova_compute[187128]: 2025-12-11 06:08:17.876 187132 DEBUG oslo_concurrency.lockutils [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:17 np0005554845 nova_compute[187128]: 2025-12-11 06:08:17.877 187132 DEBUG oslo_concurrency.lockutils [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:17 np0005554845 nova_compute[187128]: 2025-12-11 06:08:17.878 187132 INFO nova.compute.manager [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Terminating instance#033[00m
Dec 11 01:08:17 np0005554845 nova_compute[187128]: 2025-12-11 06:08:17.879 187132 DEBUG nova.compute.manager [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:08:17 np0005554845 kernel: tap0a706dcf-eb (unregistering): left promiscuous mode
Dec 11 01:08:17 np0005554845 NetworkManager[55529]: <info>  [1765433297.9022] device (tap0a706dcf-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:08:17 np0005554845 nova_compute[187128]: 2025-12-11 06:08:17.914 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:17 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:17Z|00117|binding|INFO|Releasing lport 0a706dcf-eb29-4098-946a-e1a25e5587a8 from this chassis (sb_readonly=0)
Dec 11 01:08:17 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:17Z|00118|binding|INFO|Setting lport 0a706dcf-eb29-4098-946a-e1a25e5587a8 down in Southbound
Dec 11 01:08:17 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:17Z|00119|binding|INFO|Removing iface tap0a706dcf-eb ovn-installed in OVS
Dec 11 01:08:17 np0005554845 nova_compute[187128]: 2025-12-11 06:08:17.917 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:17 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:17.926 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d8:5b:71 10.100.0.6'], port_security=['fa:16:3e:d8:5b:71 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'd2953461-e3c8-4475-978e-99fe1b807179', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-11d68746-7105-4c6c-a1c2-930f081e2867', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'f033bed42fce423089c02594df44ced9', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'e44c51af-f4fc-4e7e-9665-16178aceb0df', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=34a05a6c-1f90-4e8f-99db-f4d9b3f4fa41, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=0a706dcf-eb29-4098-946a-e1a25e5587a8) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:08:17 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:17.927 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 0a706dcf-eb29-4098-946a-e1a25e5587a8 in datapath 11d68746-7105-4c6c-a1c2-930f081e2867 unbound from our chassis#033[00m
Dec 11 01:08:17 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:17.929 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 11d68746-7105-4c6c-a1c2-930f081e2867, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:08:17 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:17.932 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a8f1cba6-3d1e-4aa4-92ea-aa085ead0ad5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:17 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:17.932 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867 namespace which is not needed anymore#033[00m
Dec 11 01:08:17 np0005554845 nova_compute[187128]: 2025-12-11 06:08:17.937 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:17 np0005554845 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Deactivated successfully.
Dec 11 01:08:17 np0005554845 systemd[1]: machine-qemu\x2d7\x2dinstance\x2d0000000e.scope: Consumed 12.846s CPU time.
Dec 11 01:08:17 np0005554845 systemd-machined[153381]: Machine qemu-7-instance-0000000e terminated.
Dec 11 01:08:18 np0005554845 neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867[216345]: [NOTICE]   (216349) : haproxy version is 2.8.14-c23fe91
Dec 11 01:08:18 np0005554845 neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867[216345]: [NOTICE]   (216349) : path to executable is /usr/sbin/haproxy
Dec 11 01:08:18 np0005554845 neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867[216345]: [ALERT]    (216349) : Current worker (216351) exited with code 143 (Terminated)
Dec 11 01:08:18 np0005554845 neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867[216345]: [WARNING]  (216349) : All workers exited. Exiting... (0)
Dec 11 01:08:18 np0005554845 systemd[1]: libpod-79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719.scope: Deactivated successfully.
Dec 11 01:08:18 np0005554845 podman[216786]: 2025-12-11 06:08:18.095577452 +0000 UTC m=+0.063004988 container died 79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:08:18 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719-userdata-shm.mount: Deactivated successfully.
Dec 11 01:08:18 np0005554845 systemd[1]: var-lib-containers-storage-overlay-3324b6b3557bdb01db873748671f00d5c40e8cb5808d3bb88a7b3d5d2a2a55d4-merged.mount: Deactivated successfully.
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.141 187132 INFO nova.virt.libvirt.driver [-] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Instance destroyed successfully.#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.141 187132 DEBUG nova.objects.instance [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lazy-loading 'resources' on Instance uuid d2953461-e3c8-4475-978e-99fe1b807179 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:08:18 np0005554845 podman[216786]: 2025-12-11 06:08:18.143333176 +0000 UTC m=+0.110760712 container cleanup 79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.156 187132 DEBUG nova.virt.libvirt.vif [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:07:43Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestServerMultinode-server-297968829',display_name='tempest-TestServerMultinode-server-297968829',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testservermultinode-server-297968829',id=14,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data=None,key_name=None,keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:07:59Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='f033bed42fce423089c02594df44ced9',ramdisk_id='',reservation_id='r-t08rd9ik',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,admin,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestServerMultinode-1326182770',owner_user_name='tempest-TestServerMultinode-1326182770-project-admin'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:07:59Z,user_data=None,user_id='bae38cdb18134ffe9b5f38f23622cd25',uuid=d2953461-e3c8-4475-978e-99fe1b807179,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "address": "fa:16:3e:d8:5b:71", "network": {"id": "11d68746-7105-4c6c-a1c2-930f081e2867", "bridge": "br-int", "label": "tempest-TestServerMultinode-656670444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d95ba983e7b4ec7b161e0ab6b0b56ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a706dcf-eb", "ovs_interfaceid": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.156 187132 DEBUG nova.network.os_vif_util [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Converting VIF {"id": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "address": "fa:16:3e:d8:5b:71", "network": {"id": "11d68746-7105-4c6c-a1c2-930f081e2867", "bridge": "br-int", "label": "tempest-TestServerMultinode-656670444-network", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "0d95ba983e7b4ec7b161e0ab6b0b56ed", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0a706dcf-eb", "ovs_interfaceid": "0a706dcf-eb29-4098-946a-e1a25e5587a8", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.157 187132 DEBUG nova.network.os_vif_util [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d8:5b:71,bridge_name='br-int',has_traffic_filtering=True,id=0a706dcf-eb29-4098-946a-e1a25e5587a8,network=Network(11d68746-7105-4c6c-a1c2-930f081e2867),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a706dcf-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.157 187132 DEBUG os_vif [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:5b:71,bridge_name='br-int',has_traffic_filtering=True,id=0a706dcf-eb29-4098-946a-e1a25e5587a8,network=Network(11d68746-7105-4c6c-a1c2-930f081e2867),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a706dcf-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.159 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.159 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0a706dcf-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.162 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.163 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.166 187132 INFO os_vif [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d8:5b:71,bridge_name='br-int',has_traffic_filtering=True,id=0a706dcf-eb29-4098-946a-e1a25e5587a8,network=Network(11d68746-7105-4c6c-a1c2-930f081e2867),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0a706dcf-eb')#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.167 187132 INFO nova.virt.libvirt.driver [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Deleting instance files /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179_del#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.168 187132 INFO nova.virt.libvirt.driver [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Deletion of /var/lib/nova/instances/d2953461-e3c8-4475-978e-99fe1b807179_del complete#033[00m
Dec 11 01:08:18 np0005554845 systemd[1]: libpod-conmon-79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719.scope: Deactivated successfully.
Dec 11 01:08:18 np0005554845 podman[216833]: 2025-12-11 06:08:18.209441587 +0000 UTC m=+0.040548659 container remove 79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:08:18 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:18.214 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2b6d850b-18e5-4b32-8690-988a10aa15bd]: (4, ('Thu Dec 11 06:08:18 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867 (79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719)\n79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719\nThu Dec 11 06:08:18 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867 (79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719)\n79386c6bfdc2517c5651c84bbf701c462638a59319579c7af58d6d150a054719\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:18 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:18.215 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[88787b40-5b3a-4d37-87fc-d291683eba93]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:18 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:18.216 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap11d68746-70, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:08:18 np0005554845 kernel: tap11d68746-70: left promiscuous mode
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.217 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.224 187132 INFO nova.compute.manager [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.225 187132 DEBUG oslo.service.loopingcall [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.225 187132 DEBUG nova.compute.manager [-] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.225 187132 DEBUG nova.network.neutron [-] [instance: d2953461-e3c8-4475-978e-99fe1b807179] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:08:18 np0005554845 nova_compute[187128]: 2025-12-11 06:08:18.233 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:18 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:18.235 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f68ab5b5-ecbb-4671-a397-3cc06b9447d9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:18 np0005554845 podman[216827]: 2025-12-11 06:08:18.237163009 +0000 UTC m=+0.067602373 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:08:18 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:18.251 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[9a7302e8-0c2c-4f16-ac07-ac147d3852dc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:18 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:18.252 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c80fe4a5-b07c-48b3-9aca-d2a5b8c612e0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:18 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:18.265 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a6f83768-0285-4163-b7cd-3572d5031202]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354949, 'reachable_time': 40876, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 216867, 'error': None, 'target': 'ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:18 np0005554845 systemd[1]: run-netns-ovnmeta\x2d11d68746\x2d7105\x2d4c6c\x2da1c2\x2d930f081e2867.mount: Deactivated successfully.
Dec 11 01:08:18 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:18.271 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-11d68746-7105-4c6c-a1c2-930f081e2867 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:08:18 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:18.271 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[6b3dd8fd-391f-4756-b207-03011b19e96b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:19Z|00016|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:39:5c:9d 10.100.0.14
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.604 187132 DEBUG nova.network.neutron [-] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.623 187132 INFO nova.compute.manager [-] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Took 1.40 seconds to deallocate network for instance.#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.657 187132 DEBUG nova.compute.manager [req-21524eda-086e-414b-b968-5959992d7cdc req-d3d147a3-c500-46cf-94ba-de78f7d4ffa6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Received event network-vif-deleted-0a706dcf-eb29-4098-946a-e1a25e5587a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.662 187132 DEBUG oslo_concurrency.lockutils [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.663 187132 DEBUG oslo_concurrency.lockutils [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.832 187132 DEBUG nova.compute.manager [req-4d1b5b77-6bac-4bcf-a352-e42081a3dd02 req-46ab5a73-ddc5-437e-aa38-c2545882446a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Received event network-vif-plugged-0a706dcf-eb29-4098-946a-e1a25e5587a8 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.832 187132 DEBUG oslo_concurrency.lockutils [req-4d1b5b77-6bac-4bcf-a352-e42081a3dd02 req-46ab5a73-ddc5-437e-aa38-c2545882446a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "d2953461-e3c8-4475-978e-99fe1b807179-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.833 187132 DEBUG oslo_concurrency.lockutils [req-4d1b5b77-6bac-4bcf-a352-e42081a3dd02 req-46ab5a73-ddc5-437e-aa38-c2545882446a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.833 187132 DEBUG oslo_concurrency.lockutils [req-4d1b5b77-6bac-4bcf-a352-e42081a3dd02 req-46ab5a73-ddc5-437e-aa38-c2545882446a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.833 187132 DEBUG nova.compute.manager [req-4d1b5b77-6bac-4bcf-a352-e42081a3dd02 req-46ab5a73-ddc5-437e-aa38-c2545882446a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] No waiting events found dispatching network-vif-plugged-0a706dcf-eb29-4098-946a-e1a25e5587a8 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.833 187132 WARNING nova.compute.manager [req-4d1b5b77-6bac-4bcf-a352-e42081a3dd02 req-46ab5a73-ddc5-437e-aa38-c2545882446a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Received unexpected event network-vif-plugged-0a706dcf-eb29-4098-946a-e1a25e5587a8 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.891 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:19 np0005554845 nova_compute[187128]: 2025-12-11 06:08:19.950 187132 DEBUG nova.compute.provider_tree [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.658 187132 DEBUG nova.scheduler.client.report [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.740 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.745 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.800 187132 DEBUG oslo_concurrency.lockutils [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 1.137s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.802 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.062s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.803 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.803 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.827 187132 INFO nova.scheduler.client.report [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Deleted allocations for instance d2953461-e3c8-4475-978e-99fe1b807179#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.890 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.919 187132 DEBUG oslo_concurrency.lockutils [None req-a1cde866-39bc-439e-920e-ef563d3caaed bae38cdb18134ffe9b5f38f23622cd25 f033bed42fce423089c02594df44ced9 - - default default] Lock "d2953461-e3c8-4475-978e-99fe1b807179" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.043s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.951 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:08:20 np0005554845 nova_compute[187128]: 2025-12-11 06:08:20.951 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.019 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d/disk --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.025 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.078 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.080 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.133 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.332 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.334 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5305MB free_disk=73.27349090576172GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.334 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.335 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.406 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.407 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance d29187d8-59e6-4e5a-aef7-97fef6cf24c7 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.407 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.407 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.459 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.483 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.509 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:08:21 np0005554845 nova_compute[187128]: 2025-12-11 06:08:21.510 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.175s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:22 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:22.187 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=7, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=6) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:08:22 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:22.188 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:08:22 np0005554845 nova_compute[187128]: 2025-12-11 06:08:22.188 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:22 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:22Z|00120|binding|INFO|Releasing lport 758c2e66-9229-4c0e-a50a-c862b3cbb788 from this chassis (sb_readonly=0)
Dec 11 01:08:22 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:22Z|00121|binding|INFO|Releasing lport 19fbb851-56d3-4e9c-872f-295bbcc3715e from this chassis (sb_readonly=0)
Dec 11 01:08:22 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:22Z|00122|binding|INFO|Releasing lport c1aa7c5f-7cb5-4f8e-b844-cd400103ee8b from this chassis (sb_readonly=0)
Dec 11 01:08:22 np0005554845 nova_compute[187128]: 2025-12-11 06:08:22.461 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:22 np0005554845 nova_compute[187128]: 2025-12-11 06:08:22.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:22 np0005554845 nova_compute[187128]: 2025-12-11 06:08:22.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec 11 01:08:22 np0005554845 nova_compute[187128]: 2025-12-11 06:08:22.718 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec 11 01:08:23 np0005554845 podman[216881]: 2025-12-11 06:08:23.137080939 +0000 UTC m=+0.057179470 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>)
Dec 11 01:08:23 np0005554845 podman[216882]: 2025-12-11 06:08:23.143716068 +0000 UTC m=+0.064358854 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., architecture=x86_64, vcs-type=git, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, io.buildah.version=1.33.7, name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Dec 11 01:08:23 np0005554845 nova_compute[187128]: 2025-12-11 06:08:23.163 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:23 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:23.190 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '7'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:08:23 np0005554845 nova_compute[187128]: 2025-12-11 06:08:23.718 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:23 np0005554845 nova_compute[187128]: 2025-12-11 06:08:23.719 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:08:23 np0005554845 nova_compute[187128]: 2025-12-11 06:08:23.719 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:08:23 np0005554845 nova_compute[187128]: 2025-12-11 06:08:23.926 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:08:23 np0005554845 nova_compute[187128]: 2025-12-11 06:08:23.927 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:08:23 np0005554845 nova_compute[187128]: 2025-12-11 06:08:23.927 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 01:08:23 np0005554845 nova_compute[187128]: 2025-12-11 06:08:23.931 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid d29187d8-59e6-4e5a-aef7-97fef6cf24c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:08:24 np0005554845 nova_compute[187128]: 2025-12-11 06:08:24.893 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:25 np0005554845 nova_compute[187128]: 2025-12-11 06:08:25.081 187132 INFO nova.compute.manager [None req-8f06fc90-5fd1-4204-ba53-d7a11cbe9a3c 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Get console output#033[00m
Dec 11 01:08:25 np0005554845 nova_compute[187128]: 2025-12-11 06:08:25.086 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:08:25 np0005554845 nova_compute[187128]: 2025-12-11 06:08:25.302 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance_info_cache with network_info: [{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:08:25 np0005554845 nova_compute[187128]: 2025-12-11 06:08:25.339 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:08:25 np0005554845 nova_compute[187128]: 2025-12-11 06:08:25.339 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:08:25 np0005554845 nova_compute[187128]: 2025-12-11 06:08:25.340 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:25 np0005554845 nova_compute[187128]: 2025-12-11 06:08:25.340 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:25 np0005554845 nova_compute[187128]: 2025-12-11 06:08:25.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:25 np0005554845 nova_compute[187128]: 2025-12-11 06:08:25.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:26.220 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:26.221 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:26.222 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:26 np0005554845 nova_compute[187128]: 2025-12-11 06:08:26.686 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:26 np0005554845 nova_compute[187128]: 2025-12-11 06:08:26.710 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:26 np0005554845 nova_compute[187128]: 2025-12-11 06:08:26.710 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:26 np0005554845 nova_compute[187128]: 2025-12-11 06:08:26.710 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:08:26 np0005554845 nova_compute[187128]: 2025-12-11 06:08:26.711 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.293 187132 DEBUG nova.compute.manager [req-437606ea-c915-4a4d-b112-47a6598823a5 req-17a15e6f-ba18-4c06-856d-69c002f48cff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-changed-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.294 187132 DEBUG nova.compute.manager [req-437606ea-c915-4a4d-b112-47a6598823a5 req-17a15e6f-ba18-4c06-856d-69c002f48cff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Refreshing instance network info cache due to event network-changed-aee944ef-3d55-4d72-85fd-0bcba5cebad9. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.294 187132 DEBUG oslo_concurrency.lockutils [req-437606ea-c915-4a4d-b112-47a6598823a5 req-17a15e6f-ba18-4c06-856d-69c002f48cff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.294 187132 DEBUG oslo_concurrency.lockutils [req-437606ea-c915-4a4d-b112-47a6598823a5 req-17a15e6f-ba18-4c06-856d-69c002f48cff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.294 187132 DEBUG nova.network.neutron [req-437606ea-c915-4a4d-b112-47a6598823a5 req-17a15e6f-ba18-4c06-856d-69c002f48cff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Refreshing network info cache for port aee944ef-3d55-4d72-85fd-0bcba5cebad9 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.391 187132 DEBUG oslo_concurrency.lockutils [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.392 187132 DEBUG oslo_concurrency.lockutils [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.392 187132 DEBUG oslo_concurrency.lockutils [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.394 187132 DEBUG oslo_concurrency.lockutils [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.394 187132 DEBUG oslo_concurrency.lockutils [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.395 187132 INFO nova.compute.manager [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Terminating instance#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.396 187132 DEBUG nova.compute.manager [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:08:27 np0005554845 kernel: tapaee944ef-3d (unregistering): left promiscuous mode
Dec 11 01:08:27 np0005554845 NetworkManager[55529]: <info>  [1765433307.4248] device (tapaee944ef-3d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:08:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:27Z|00123|binding|INFO|Releasing lport aee944ef-3d55-4d72-85fd-0bcba5cebad9 from this chassis (sb_readonly=0)
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.437 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:27Z|00124|binding|INFO|Setting lport aee944ef-3d55-4d72-85fd-0bcba5cebad9 down in Southbound
Dec 11 01:08:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:27Z|00125|binding|INFO|Removing iface tapaee944ef-3d ovn-installed in OVS
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.441 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.445 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:39:5c:9d 10.100.0.14'], port_security=['fa:16:3e:39:5c:9d 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'd29187d8-59e6-4e5a-aef7-97fef6cf24c7', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '12', 'neutron:security_group_ids': '25d5132c-a309-410e-93c9-7759e7948f62', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=114c3962-e260-4a4f-84c2-081b45071782, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=aee944ef-3d55-4d72-85fd-0bcba5cebad9) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.447 104320 INFO neutron.agent.ovn.metadata.agent [-] Port aee944ef-3d55-4d72-85fd-0bcba5cebad9 in datapath fa8f22dd-28ac-458d-9f63-a7d8a915d217 unbound from our chassis#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.449 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network fa8f22dd-28ac-458d-9f63-a7d8a915d217, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.450 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d636e67f-453e-402c-b2d5-3fdf3b251646]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.452 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.452 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217 namespace which is not needed anymore#033[00m
Dec 11 01:08:27 np0005554845 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Deactivated successfully.
Dec 11 01:08:27 np0005554845 systemd[1]: machine-qemu\x2d8\x2dinstance\x2d0000000a.scope: Consumed 13.543s CPU time.
Dec 11 01:08:27 np0005554845 systemd-machined[153381]: Machine qemu-8-instance-0000000a terminated.
Dec 11 01:08:27 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[216617]: [NOTICE]   (216621) : haproxy version is 2.8.14-c23fe91
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.619 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[216617]: [NOTICE]   (216621) : path to executable is /usr/sbin/haproxy
Dec 11 01:08:27 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[216617]: [WARNING]  (216621) : Exiting Master process...
Dec 11 01:08:27 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[216617]: [ALERT]    (216621) : Current worker (216623) exited with code 143 (Terminated)
Dec 11 01:08:27 np0005554845 neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217[216617]: [WARNING]  (216621) : All workers exited. Exiting... (0)
Dec 11 01:08:27 np0005554845 systemd[1]: libpod-842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b.scope: Deactivated successfully.
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.626 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 podman[216951]: 2025-12-11 06:08:27.632094775 +0000 UTC m=+0.054535729 container died 842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.654 187132 INFO nova.virt.libvirt.driver [-] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Instance destroyed successfully.#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.655 187132 DEBUG nova.objects.instance [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'resources' on Instance uuid d29187d8-59e6-4e5a-aef7-97fef6cf24c7 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.659 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.672 187132 DEBUG nova.virt.libvirt.vif [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-11T06:06:46Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1186378686',display_name='tempest-TestNetworkAdvancedServerOps-server-1186378686',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1186378686',id=10,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDvu269K3Wq5vyC2HL1b6A8dJTLjcpEsj7D9cVxVm9DHphj86xufSg/vW/M3Pt7tVDz6L3awCoDApZq7RNDhAAwSmH7Z/SFby/7dDKGWNDp4HLOCIl9fXp9onGKTfEF+xg==',key_name='tempest-TestNetworkAdvancedServerOps-1141582953',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:08:06Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-2cydsqis',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',clean_attempts='1',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:08:06Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=d29187d8-59e6-4e5a-aef7-97fef6cf24c7,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.673 187132 DEBUG nova.network.os_vif_util [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.219", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:08:27 np0005554845 systemd[1]: var-lib-containers-storage-overlay-cbed6ccb7843ddac884ce7c7711f6271d30220ca39fd605573f306ba32c3b523-merged.mount: Deactivated successfully.
Dec 11 01:08:27 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b-userdata-shm.mount: Deactivated successfully.
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.673 187132 DEBUG nova.network.os_vif_util [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.674 187132 DEBUG os_vif [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.675 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.676 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapaee944ef-3d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.677 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.679 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 podman[216951]: 2025-12-11 06:08:27.681173565 +0000 UTC m=+0.103614519 container cleanup 842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.683 187132 INFO os_vif [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:39:5c:9d,bridge_name='br-int',has_traffic_filtering=True,id=aee944ef-3d55-4d72-85fd-0bcba5cebad9,network=Network(fa8f22dd-28ac-458d-9f63-a7d8a915d217),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapaee944ef-3d')#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.683 187132 INFO nova.virt.libvirt.driver [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Deleting instance files /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7_del#033[00m
Dec 11 01:08:27 np0005554845 systemd[1]: libpod-conmon-842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b.scope: Deactivated successfully.
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.690 187132 INFO nova.virt.libvirt.driver [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Deletion of /var/lib/nova/instances/d29187d8-59e6-4e5a-aef7-97fef6cf24c7_del complete#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.702 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:27Z|00126|binding|INFO|Releasing lport 758c2e66-9229-4c0e-a50a-c862b3cbb788 from this chassis (sb_readonly=0)
Dec 11 01:08:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:27Z|00127|binding|INFO|Releasing lport 19fbb851-56d3-4e9c-872f-295bbcc3715e from this chassis (sb_readonly=0)
Dec 11 01:08:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:27Z|00128|binding|INFO|Releasing lport c1aa7c5f-7cb5-4f8e-b844-cd400103ee8b from this chassis (sb_readonly=0)
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.741 187132 INFO nova.compute.manager [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.742 187132 DEBUG oslo.service.loopingcall [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.742 187132 DEBUG nova.compute.manager [-] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.743 187132 DEBUG nova.network.neutron [-] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:08:27 np0005554845 podman[216993]: 2025-12-11 06:08:27.757216565 +0000 UTC m=+0.049728748 container remove 842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.762 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4bac7d16-d778-4dc0-92f6-2f5017ca9eab]: (4, ('Thu Dec 11 06:08:27 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217 (842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b)\n842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b\nThu Dec 11 06:08:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217 (842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b)\n842ce8040b453267f717ebc52b5cce146233593e5cae7e0630d417f44d982a0b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.764 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1e207ab4-07a5-4e5d-a006-3e24b3758298]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.765 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfa8f22dd-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.766 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 kernel: tapfa8f22dd-20: left promiscuous mode
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.808 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.812 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f146246a-5651-4c70-9714-3cda834b20e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.816 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 nova_compute[187128]: 2025-12-11 06:08:27.830 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.841 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5a0ddec3-cb76-4c93-b64f-7f3753465cb6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.843 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[e3348dd0-c052-49ed-bcba-eb97737dfadf]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.862 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[64eee9c6-7c39-4d75-9bf9-3e79b93ee869]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 355898, 'reachable_time': 44679, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217008, 'error': None, 'target': 'ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.866 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-fa8f22dd-28ac-458d-9f63-a7d8a915d217 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:08:27 np0005554845 systemd[1]: run-netns-ovnmeta\x2dfa8f22dd\x2d28ac\x2d458d\x2d9f63\x2da7d8a915d217.mount: Deactivated successfully.
Dec 11 01:08:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:08:27.866 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[78ca3d34-f60d-4fcc-9289-f5fc08514844]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.266 187132 DEBUG nova.network.neutron [-] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.289 187132 INFO nova.compute.manager [-] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Took 1.55 seconds to deallocate network for instance.#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.330 187132 DEBUG oslo_concurrency.lockutils [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.330 187132 DEBUG oslo_concurrency.lockutils [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.408 187132 DEBUG nova.compute.provider_tree [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.443 187132 DEBUG nova.scheduler.client.report [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.473 187132 DEBUG oslo_concurrency.lockutils [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.495 187132 INFO nova.scheduler.client.report [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Deleted allocations for instance d29187d8-59e6-4e5a-aef7-97fef6cf24c7#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.530 187132 DEBUG nova.compute.manager [req-40fe2284-8011-4ddd-aaa8-e2639ca7b758 req-95dc461d-db79-4722-82d2-4f2230b60e2a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Received event network-vif-deleted-aee944ef-3d55-4d72-85fd-0bcba5cebad9 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.576 187132 DEBUG oslo_concurrency.lockutils [None req-ba2d536a-7933-4a51-bd8c-b6cff0b675e8 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "d29187d8-59e6-4e5a-aef7-97fef6cf24c7" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.184s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.740 187132 DEBUG nova.network.neutron [req-437606ea-c915-4a4d-b112-47a6598823a5 req-17a15e6f-ba18-4c06-856d-69c002f48cff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updated VIF entry in instance network info cache for port aee944ef-3d55-4d72-85fd-0bcba5cebad9. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.740 187132 DEBUG nova.network.neutron [req-437606ea-c915-4a4d-b112-47a6598823a5 req-17a15e6f-ba18-4c06-856d-69c002f48cff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Updating instance_info_cache with network_info: [{"id": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "address": "fa:16:3e:39:5c:9d", "network": {"id": "fa8f22dd-28ac-458d-9f63-a7d8a915d217", "bridge": "br-int", "label": "tempest-network-smoke--1045645596", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapaee944ef-3d", "ovs_interfaceid": "aee944ef-3d55-4d72-85fd-0bcba5cebad9", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.772 187132 DEBUG oslo_concurrency.lockutils [req-437606ea-c915-4a4d-b112-47a6598823a5 req-17a15e6f-ba18-4c06-856d-69c002f48cff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-d29187d8-59e6-4e5a-aef7-97fef6cf24c7" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:08:29 np0005554845 nova_compute[187128]: 2025-12-11 06:08:29.895 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:31 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:31Z|00129|binding|INFO|Releasing lport 758c2e66-9229-4c0e-a50a-c862b3cbb788 from this chassis (sb_readonly=0)
Dec 11 01:08:31 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:31Z|00130|binding|INFO|Releasing lport 19fbb851-56d3-4e9c-872f-295bbcc3715e from this chassis (sb_readonly=0)
Dec 11 01:08:31 np0005554845 nova_compute[187128]: 2025-12-11 06:08:31.765 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:32 np0005554845 nova_compute[187128]: 2025-12-11 06:08:32.679 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:33 np0005554845 nova_compute[187128]: 2025-12-11 06:08:33.139 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433298.1374154, d2953461-e3c8-4475-978e-99fe1b807179 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:08:33 np0005554845 nova_compute[187128]: 2025-12-11 06:08:33.140 187132 INFO nova.compute.manager [-] [instance: d2953461-e3c8-4475-978e-99fe1b807179] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:08:33 np0005554845 nova_compute[187128]: 2025-12-11 06:08:33.162 187132 DEBUG nova.compute.manager [None req-a2925bce-1189-4cc9-ba5d-35e02b409f49 - - - - - -] [instance: d2953461-e3c8-4475-978e-99fe1b807179] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:08:34 np0005554845 nova_compute[187128]: 2025-12-11 06:08:34.898 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:35 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:35Z|00131|binding|INFO|Releasing lport 758c2e66-9229-4c0e-a50a-c862b3cbb788 from this chassis (sb_readonly=0)
Dec 11 01:08:35 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:35Z|00132|binding|INFO|Releasing lport 19fbb851-56d3-4e9c-872f-295bbcc3715e from this chassis (sb_readonly=0)
Dec 11 01:08:35 np0005554845 nova_compute[187128]: 2025-12-11 06:08:35.627 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:36 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:36Z|00133|binding|INFO|Releasing lport 758c2e66-9229-4c0e-a50a-c862b3cbb788 from this chassis (sb_readonly=0)
Dec 11 01:08:36 np0005554845 ovn_controller[95428]: 2025-12-11T06:08:36Z|00134|binding|INFO|Releasing lport 19fbb851-56d3-4e9c-872f-295bbcc3715e from this chassis (sb_readonly=0)
Dec 11 01:08:36 np0005554845 nova_compute[187128]: 2025-12-11 06:08:36.563 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:37 np0005554845 nova_compute[187128]: 2025-12-11 06:08:37.683 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:39 np0005554845 podman[217009]: 2025-12-11 06:08:39.135903544 +0000 UTC m=+0.061650801 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:08:39 np0005554845 nova_compute[187128]: 2025-12-11 06:08:39.900 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:42 np0005554845 nova_compute[187128]: 2025-12-11 06:08:42.653 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433307.6522593, d29187d8-59e6-4e5a-aef7-97fef6cf24c7 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:08:42 np0005554845 nova_compute[187128]: 2025-12-11 06:08:42.654 187132 INFO nova.compute.manager [-] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:08:42 np0005554845 nova_compute[187128]: 2025-12-11 06:08:42.669 187132 DEBUG nova.compute.manager [None req-70f6ce6e-ceae-4e2e-9ed2-9310b0f9fcfd - - - - - -] [instance: d29187d8-59e6-4e5a-aef7-97fef6cf24c7] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:08:42 np0005554845 nova_compute[187128]: 2025-12-11 06:08:42.687 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:44 np0005554845 nova_compute[187128]: 2025-12-11 06:08:44.902 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:45 np0005554845 podman[217033]: 2025-12-11 06:08:45.130586949 +0000 UTC m=+0.068181109 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 01:08:46 np0005554845 nova_compute[187128]: 2025-12-11 06:08:46.183 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:47 np0005554845 podman[217053]: 2025-12-11 06:08:47.112278639 +0000 UTC m=+0.049655236 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 01:08:47 np0005554845 podman[217054]: 2025-12-11 06:08:47.153878557 +0000 UTC m=+0.085545349 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:08:47 np0005554845 nova_compute[187128]: 2025-12-11 06:08:47.690 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:49 np0005554845 podman[217098]: 2025-12-11 06:08:49.167701587 +0000 UTC m=+0.082571579 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd)
Dec 11 01:08:49 np0005554845 nova_compute[187128]: 2025-12-11 06:08:49.904 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:52 np0005554845 nova_compute[187128]: 2025-12-11 06:08:52.692 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:54 np0005554845 podman[217118]: 2025-12-11 06:08:54.132655638 +0000 UTC m=+0.059459021 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:08:54 np0005554845 podman[217119]: 2025-12-11 06:08:54.148340703 +0000 UTC m=+0.071367935 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, distribution-scope=public, maintainer=Red Hat, Inc., release=1755695350, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, config_id=edpm, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, version=9.6, vcs-type=git, build-date=2025-08-20T13:12:41)
Dec 11 01:08:54 np0005554845 nova_compute[187128]: 2025-12-11 06:08:54.905 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:56 np0005554845 nova_compute[187128]: 2025-12-11 06:08:56.190 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:57 np0005554845 nova_compute[187128]: 2025-12-11 06:08:57.695 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:08:58 np0005554845 nova_compute[187128]: 2025-12-11 06:08:58.719 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:08:58 np0005554845 nova_compute[187128]: 2025-12-11 06:08:58.922 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Triggering sync for uuid 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268#033[00m
Dec 11 01:08:58 np0005554845 nova_compute[187128]: 2025-12-11 06:08:58.923 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:08:58 np0005554845 nova_compute[187128]: 2025-12-11 06:08:58.924 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:08:58 np0005554845 nova_compute[187128]: 2025-12-11 06:08:58.963 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.039s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:08:59 np0005554845 nova_compute[187128]: 2025-12-11 06:08:59.908 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.574 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "99e00ae7-84c5-40a5-a280-10071d1df3f0" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.575 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.599 187132 DEBUG nova.compute.manager [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.675 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.675 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.683 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.683 187132 INFO nova.compute.claims [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.832 187132 DEBUG nova.compute.provider_tree [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.848 187132 DEBUG nova.scheduler.client.report [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.866 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.190s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.866 187132 DEBUG nova.compute.manager [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.930 187132 DEBUG nova.compute.manager [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.931 187132 DEBUG nova.network.neutron [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.956 187132 INFO nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:09:01 np0005554845 nova_compute[187128]: 2025-12-11 06:09:01.980 187132 DEBUG nova.compute.manager [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.092 187132 DEBUG nova.compute.manager [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.093 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.093 187132 INFO nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Creating image(s)#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.094 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "/var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.094 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.095 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.106 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.152 187132 DEBUG nova.policy [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.157 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.051s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.158 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.158 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.169 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.219 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.050s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.221 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.265 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk 1073741824" returned: 0 in 0.044s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.266 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.267 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.335 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.336 187132 DEBUG nova.virt.disk.api [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Checking if we can resize image /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.336 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.390 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.392 187132 DEBUG nova.virt.disk.api [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Cannot resize image /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.392 187132 DEBUG nova.objects.instance [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'migration_context' on Instance uuid 99e00ae7-84c5-40a5-a280-10071d1df3f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.408 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.409 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Ensure instance console log exists: /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.409 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.410 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.410 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:02 np0005554845 nova_compute[187128]: 2025-12-11 06:09:02.751 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:03 np0005554845 nova_compute[187128]: 2025-12-11 06:09:03.917 187132 DEBUG nova.network.neutron [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Successfully created port: f3f6ac24-c680-4caa-b2c1-d317380417d7 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:09:04 np0005554845 nova_compute[187128]: 2025-12-11 06:09:04.943 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:05 np0005554845 nova_compute[187128]: 2025-12-11 06:09:05.391 187132 DEBUG nova.network.neutron [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Successfully updated port: f3f6ac24-c680-4caa-b2c1-d317380417d7 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:09:05 np0005554845 nova_compute[187128]: 2025-12-11 06:09:05.410 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:09:05 np0005554845 nova_compute[187128]: 2025-12-11 06:09:05.410 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquired lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:09:05 np0005554845 nova_compute[187128]: 2025-12-11 06:09:05.410 187132 DEBUG nova.network.neutron [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:09:05 np0005554845 nova_compute[187128]: 2025-12-11 06:09:05.526 187132 DEBUG nova.compute.manager [req-172a703d-ee93-42c7-82bc-cfcf0aaab639 req-d9dde78b-9908-48b6-9599-3871a6f88f36 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Received event network-changed-f3f6ac24-c680-4caa-b2c1-d317380417d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:05 np0005554845 nova_compute[187128]: 2025-12-11 06:09:05.527 187132 DEBUG nova.compute.manager [req-172a703d-ee93-42c7-82bc-cfcf0aaab639 req-d9dde78b-9908-48b6-9599-3871a6f88f36 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Refreshing instance network info cache due to event network-changed-f3f6ac24-c680-4caa-b2c1-d317380417d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:09:05 np0005554845 nova_compute[187128]: 2025-12-11 06:09:05.527 187132 DEBUG oslo_concurrency.lockutils [req-172a703d-ee93-42c7-82bc-cfcf0aaab639 req-d9dde78b-9908-48b6-9599-3871a6f88f36 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:09:06 np0005554845 nova_compute[187128]: 2025-12-11 06:09:06.197 187132 DEBUG nova.network.neutron [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:09:07 np0005554845 nova_compute[187128]: 2025-12-11 06:09:07.754 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.614 187132 DEBUG nova.network.neutron [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Updating instance_info_cache with network_info: [{"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.645 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Releasing lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.645 187132 DEBUG nova.compute.manager [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Instance network_info: |[{"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.647 187132 DEBUG oslo_concurrency.lockutils [req-172a703d-ee93-42c7-82bc-cfcf0aaab639 req-d9dde78b-9908-48b6-9599-3871a6f88f36 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.647 187132 DEBUG nova.network.neutron [req-172a703d-ee93-42c7-82bc-cfcf0aaab639 req-d9dde78b-9908-48b6-9599-3871a6f88f36 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Refreshing network info cache for port f3f6ac24-c680-4caa-b2c1-d317380417d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.652 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Start _get_guest_xml network_info=[{"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.660 187132 WARNING nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.667 187132 DEBUG nova.virt.libvirt.host [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.668 187132 DEBUG nova.virt.libvirt.host [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.678 187132 DEBUG nova.virt.libvirt.host [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.679 187132 DEBUG nova.virt.libvirt.host [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.681 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.682 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.683 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.683 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.683 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.684 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.684 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.685 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.685 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.686 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.686 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.686 187132 DEBUG nova.virt.hardware [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.693 187132 DEBUG nova.virt.libvirt.vif [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1508444165',display_name='tempest-TestNetworkAdvancedServerOps-server-1508444165',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1508444165',id=19,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFubs0ldAofD3ANccMMmgq8hmjh8AurFdwi5RaewoBl6Y+Vx1ron6toTcXpzNxLzsBUlrUV4uy79ncS16TnDcm8ejEkhsGMufiWD1vESOHX2y+PrpdIcJjO90kjoZ11BNw==',key_name='tempest-TestNetworkAdvancedServerOps-735374275',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-r0fn2m1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:09:02Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=99e00ae7-84c5-40a5-a280-10071d1df3f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.694 187132 DEBUG nova.network.os_vif_util [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.695 187132 DEBUG nova.network.os_vif_util [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:84:93,bridge_name='br-int',has_traffic_filtering=True,id=f3f6ac24-c680-4caa-b2c1-d317380417d7,network=Network(7d668d5f-0b74-4535-a166-89784d7ca5e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3f6ac24-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.697 187132 DEBUG nova.objects.instance [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'pci_devices' on Instance uuid 99e00ae7-84c5-40a5-a280-10071d1df3f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.710 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  <uuid>99e00ae7-84c5-40a5-a280-10071d1df3f0</uuid>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  <name>instance-00000013</name>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1508444165</nova:name>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:09:08</nova:creationTime>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:        <nova:user uuid="40cb523bfe1e4484bb2e91c903500c97">tempest-TestNetworkAdvancedServerOps-369129245-project-member</nova:user>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:        <nova:project uuid="3ec4c03cd7274517b88d9087ad4cbd83">tempest-TestNetworkAdvancedServerOps-369129245</nova:project>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:        <nova:port uuid="f3f6ac24-c680-4caa-b2c1-d317380417d7">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.8" ipVersion="4"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <entry name="serial">99e00ae7-84c5-40a5-a280-10071d1df3f0</entry>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <entry name="uuid">99e00ae7-84c5-40a5-a280-10071d1df3f0</entry>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.config"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:91:84:93"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <target dev="tapf3f6ac24-c6"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/console.log" append="off"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:09:08 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:09:08 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:09:08 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:09:08 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.713 187132 DEBUG nova.compute.manager [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Preparing to wait for external event network-vif-plugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.713 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.714 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.714 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.716 187132 DEBUG nova.virt.libvirt.vif [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1508444165',display_name='tempest-TestNetworkAdvancedServerOps-server-1508444165',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1508444165',id=19,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFubs0ldAofD3ANccMMmgq8hmjh8AurFdwi5RaewoBl6Y+Vx1ron6toTcXpzNxLzsBUlrUV4uy79ncS16TnDcm8ejEkhsGMufiWD1vESOHX2y+PrpdIcJjO90kjoZ11BNw==',key_name='tempest-TestNetworkAdvancedServerOps-735374275',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-r0fn2m1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:09:02Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=99e00ae7-84c5-40a5-a280-10071d1df3f0,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.717 187132 DEBUG nova.network.os_vif_util [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.718 187132 DEBUG nova.network.os_vif_util [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:91:84:93,bridge_name='br-int',has_traffic_filtering=True,id=f3f6ac24-c680-4caa-b2c1-d317380417d7,network=Network(7d668d5f-0b74-4535-a166-89784d7ca5e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3f6ac24-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.718 187132 DEBUG os_vif [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:84:93,bridge_name='br-int',has_traffic_filtering=True,id=f3f6ac24-c680-4caa-b2c1-d317380417d7,network=Network(7d668d5f-0b74-4535-a166-89784d7ca5e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3f6ac24-c6') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.719 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.720 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.721 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.725 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.726 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapf3f6ac24-c6, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.726 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tapf3f6ac24-c6, col_values=(('external_ids', {'iface-id': 'f3f6ac24-c680-4caa-b2c1-d317380417d7', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:91:84:93', 'vm-uuid': '99e00ae7-84c5-40a5-a280-10071d1df3f0'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.729 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:08 np0005554845 NetworkManager[55529]: <info>  [1765433348.7302] manager: (tapf3f6ac24-c6): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/67)
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.731 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.739 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.742 187132 INFO os_vif [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:91:84:93,bridge_name='br-int',has_traffic_filtering=True,id=f3f6ac24-c680-4caa-b2c1-d317380417d7,network=Network(7d668d5f-0b74-4535-a166-89784d7ca5e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3f6ac24-c6')#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.805 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.806 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.806 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No VIF found with MAC fa:16:3e:91:84:93, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:09:08 np0005554845 nova_compute[187128]: 2025-12-11 06:09:08.807 187132 INFO nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Using config drive#033[00m
Dec 11 01:09:09 np0005554845 nova_compute[187128]: 2025-12-11 06:09:09.583 187132 INFO nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Creating config drive at /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.config#033[00m
Dec 11 01:09:09 np0005554845 nova_compute[187128]: 2025-12-11 06:09:09.596 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcs51cefy execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:09 np0005554845 nova_compute[187128]: 2025-12-11 06:09:09.737 187132 DEBUG oslo_concurrency.processutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpcs51cefy" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:09 np0005554845 kernel: tapf3f6ac24-c6: entered promiscuous mode
Dec 11 01:09:09 np0005554845 NetworkManager[55529]: <info>  [1765433349.8134] manager: (tapf3f6ac24-c6): new Tun device (/org/freedesktop/NetworkManager/Devices/68)
Dec 11 01:09:09 np0005554845 nova_compute[187128]: 2025-12-11 06:09:09.814 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:09Z|00135|binding|INFO|Claiming lport f3f6ac24-c680-4caa-b2c1-d317380417d7 for this chassis.
Dec 11 01:09:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:09Z|00136|binding|INFO|f3f6ac24-c680-4caa-b2c1-d317380417d7: Claiming fa:16:3e:91:84:93 10.100.0.8
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.826 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:84:93 10.100.0.8'], port_security=['fa:16:3e:91:84:93 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d668d5f-0b74-4535-a166-89784d7ca5e9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '2', 'neutron:security_group_ids': '65e258bf-a170-4ed6-bb0c-cb9465ced260', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4770d3b8-5c7f-4649-944d-65f56c7b9c25, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=f3f6ac24-c680-4caa-b2c1-d317380417d7) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.827 104320 INFO neutron.agent.ovn.metadata.agent [-] Port f3f6ac24-c680-4caa-b2c1-d317380417d7 in datapath 7d668d5f-0b74-4535-a166-89784d7ca5e9 bound to our chassis#033[00m
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.828 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 7d668d5f-0b74-4535-a166-89784d7ca5e9#033[00m
Dec 11 01:09:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:09Z|00137|binding|INFO|Setting lport f3f6ac24-c680-4caa-b2c1-d317380417d7 ovn-installed in OVS
Dec 11 01:09:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:09Z|00138|binding|INFO|Setting lport f3f6ac24-c680-4caa-b2c1-d317380417d7 up in Southbound
Dec 11 01:09:09 np0005554845 nova_compute[187128]: 2025-12-11 06:09:09.835 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.842 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[e8f2c764-ab19-49e6-918a-65a4f29447d1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.843 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap7d668d5f-01 in ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.845 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap7d668d5f-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.845 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5d5b5a81-9646-4050-bf4a-f9c25fb85242]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.845 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a9455140-3838-49e4-a8c6-582c3d634123]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:09 np0005554845 systemd-udevd[217213]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.856 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[7e5b8cff-164f-4801-9ebb-4e5d37cf3e26]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:09 np0005554845 systemd-machined[153381]: New machine qemu-9-instance-00000013.
Dec 11 01:09:09 np0005554845 NetworkManager[55529]: <info>  [1765433349.8695] device (tapf3f6ac24-c6): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:09:09 np0005554845 NetworkManager[55529]: <info>  [1765433349.8704] device (tapf3f6ac24-c6): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.874 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[05dc2864-dce4-451a-8127-2f6fbcfbf19c]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:09 np0005554845 systemd[1]: Started Virtual Machine qemu-9-instance-00000013.
Dec 11 01:09:09 np0005554845 podman[217191]: 2025-12-11 06:09:09.89014423 +0000 UTC m=+0.084402148 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.903 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[39cf2172-f880-4488-92e9-c03682d70cf1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.908 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[697e7470-6f70-4d1d-9bd4-6c52364be71d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:09 np0005554845 NetworkManager[55529]: <info>  [1765433349.9090] manager: (tap7d668d5f-00): new Veth device (/org/freedesktop/NetworkManager/Devices/69)
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.938 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[fd0f6d2b-0588-4c30-ae17-49d3dda787ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.941 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[b890e796-12eb-4b26-952e-0085674f36b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:09 np0005554845 nova_compute[187128]: 2025-12-11 06:09:09.945 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:09 np0005554845 NetworkManager[55529]: <info>  [1765433349.9688] device (tap7d668d5f-00): carrier: link connected
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.973 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[52de6aeb-1c6f-4ccb-90e9-5ab69b83e9b1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:09.994 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[18f1b9f6-205c-4a39-90fe-e449d1ceffc1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d668d5f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:37:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362461, 'reachable_time': 20078, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217255, 'error': None, 'target': 'ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.012 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5ed9c421-fddd-43ce-bcf8-116c9869c6d4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed2:3727'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 362461, 'tstamp': 362461}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217256, 'error': None, 'target': 'ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.029 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[464ca237-34a3-4f54-b53d-8ebd85702969]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap7d668d5f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d2:37:27'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 38], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362461, 'reachable_time': 20078, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217257, 'error': None, 'target': 'ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.066 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4f4e0efc-80b0-426f-a482-26b3a0dcf49d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.134 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4903294c-8da7-440d-bbd2-15d34bd2a630]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.135 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d668d5f-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.136 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.136 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7d668d5f-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.138 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:10 np0005554845 NetworkManager[55529]: <info>  [1765433350.1387] manager: (tap7d668d5f-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/70)
Dec 11 01:09:10 np0005554845 kernel: tap7d668d5f-00: entered promiscuous mode
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.140 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap7d668d5f-00, col_values=(('external_ids', {'iface-id': '46546633-a2a1-4a03-ac9c-5c5e47374adc'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.141 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:10 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:10Z|00139|binding|INFO|Releasing lport 46546633-a2a1-4a03-ac9c-5c5e47374adc from this chassis (sb_readonly=0)
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.158 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.159 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.160 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/7d668d5f-0b74-4535-a166-89784d7ca5e9.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/7d668d5f-0b74-4535-a166-89784d7ca5e9.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.161 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[e0d2ed52-84e0-4896-bf3e-99eea2fe43c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.162 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-7d668d5f-0b74-4535-a166-89784d7ca5e9
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/7d668d5f-0b74-4535-a166-89784d7ca5e9.pid.haproxy
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 7d668d5f-0b74-4535-a166-89784d7ca5e9
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:09:10 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:10.163 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9', 'env', 'PROCESS_TAG=haproxy-7d668d5f-0b74-4535-a166-89784d7ca5e9', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/7d668d5f-0b74-4535-a166-89784d7ca5e9.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.197 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433350.1973202, 99e00ae7-84c5-40a5-a280-10071d1df3f0 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.198 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] VM Started (Lifecycle Event)#033[00m
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.224 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.227 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433350.1975663, 99e00ae7-84c5-40a5-a280-10071d1df3f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.228 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.249 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.252 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:09:10 np0005554845 nova_compute[187128]: 2025-12-11 06:09:10.277 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:09:10 np0005554845 podman[217296]: 2025-12-11 06:09:10.506093201 +0000 UTC m=+0.047311924 container create fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 01:09:10 np0005554845 systemd[1]: Started libpod-conmon-fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86.scope.
Dec 11 01:09:10 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:09:10 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/8c2a13c0b32e19225b00f4317930be425728269bd92e9a85a8da0d01b129d447/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:09:10 np0005554845 podman[217296]: 2025-12-11 06:09:10.480298092 +0000 UTC m=+0.021516835 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:09:10 np0005554845 podman[217296]: 2025-12-11 06:09:10.576859849 +0000 UTC m=+0.118078622 container init fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:09:10 np0005554845 podman[217296]: 2025-12-11 06:09:10.581566876 +0000 UTC m=+0.122785619 container start fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 11 01:09:10 np0005554845 neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9[217311]: [NOTICE]   (217315) : New worker (217317) forked
Dec 11 01:09:10 np0005554845 neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9[217311]: [NOTICE]   (217315) : Loading success.
Dec 11 01:09:11 np0005554845 nova_compute[187128]: 2025-12-11 06:09:11.228 187132 DEBUG nova.network.neutron [req-172a703d-ee93-42c7-82bc-cfcf0aaab639 req-d9dde78b-9908-48b6-9599-3871a6f88f36 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Updated VIF entry in instance network info cache for port f3f6ac24-c680-4caa-b2c1-d317380417d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:09:11 np0005554845 nova_compute[187128]: 2025-12-11 06:09:11.229 187132 DEBUG nova.network.neutron [req-172a703d-ee93-42c7-82bc-cfcf0aaab639 req-d9dde78b-9908-48b6-9599-3871a6f88f36 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Updating instance_info_cache with network_info: [{"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:09:11 np0005554845 nova_compute[187128]: 2025-12-11 06:09:11.242 187132 DEBUG oslo_concurrency.lockutils [req-172a703d-ee93-42c7-82bc-cfcf0aaab639 req-d9dde78b-9908-48b6-9599-3871a6f88f36 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.686 187132 DEBUG nova.compute.manager [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Received event network-vif-plugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.687 187132 DEBUG oslo_concurrency.lockutils [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.687 187132 DEBUG oslo_concurrency.lockutils [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.687 187132 DEBUG oslo_concurrency.lockutils [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.688 187132 DEBUG nova.compute.manager [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Processing event network-vif-plugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.688 187132 DEBUG nova.compute.manager [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Received event network-vif-plugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.688 187132 DEBUG oslo_concurrency.lockutils [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.688 187132 DEBUG oslo_concurrency.lockutils [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.689 187132 DEBUG oslo_concurrency.lockutils [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.689 187132 DEBUG nova.compute.manager [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] No waiting events found dispatching network-vif-plugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.689 187132 WARNING nova.compute.manager [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Received unexpected event network-vif-plugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 for instance with vm_state building and task_state spawning.#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.689 187132 DEBUG nova.compute.manager [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-changed-fb8865d1-91e3-4d6a-9437-231beabc5816 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.689 187132 DEBUG nova.compute.manager [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Refreshing instance network info cache due to event network-changed-fb8865d1-91e3-4d6a-9437-231beabc5816. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.690 187132 DEBUG oslo_concurrency.lockutils [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.690 187132 DEBUG oslo_concurrency.lockutils [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.690 187132 DEBUG nova.network.neutron [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Refreshing network info cache for port fb8865d1-91e3-4d6a-9437-231beabc5816 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.692 187132 DEBUG nova.compute.manager [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.696 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433352.6967795, 99e00ae7-84c5-40a5-a280-10071d1df3f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.697 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.698 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.701 187132 INFO nova.virt.libvirt.driver [-] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Instance spawned successfully.#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.701 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.730 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.732 187132 DEBUG oslo_concurrency.lockutils [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.732 187132 DEBUG oslo_concurrency.lockutils [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.733 187132 DEBUG oslo_concurrency.lockutils [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.733 187132 DEBUG oslo_concurrency.lockutils [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.733 187132 DEBUG oslo_concurrency.lockutils [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.735 187132 INFO nova.compute.manager [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Terminating instance#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.737 187132 DEBUG nova.compute.manager [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.742 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.746 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.748 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.748 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.749 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.749 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.749 187132 DEBUG nova.virt.libvirt.driver [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:09:12 np0005554845 kernel: tapfb8865d1-91 (unregistering): left promiscuous mode
Dec 11 01:09:12 np0005554845 NetworkManager[55529]: <info>  [1765433352.7589] device (tapfb8865d1-91): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:09:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:12Z|00140|binding|INFO|Releasing lport fb8865d1-91e3-4d6a-9437-231beabc5816 from this chassis (sb_readonly=0)
Dec 11 01:09:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:12Z|00141|binding|INFO|Setting lport fb8865d1-91e3-4d6a-9437-231beabc5816 down in Southbound
Dec 11 01:09:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:12Z|00142|binding|INFO|Removing iface tapfb8865d1-91 ovn-installed in OVS
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.769 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:12.775 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:49:01:48 10.100.0.11'], port_security=['fa:16:3e:49:01:48 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': '524e0fc6-c557-4d6d-a3bf-a9af1980bf6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-92ebde34-cbee-4b5e-ac06-7fdddcde07a5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb93259f-17ff-4ea0-aadc-09a566a9fe40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=125d7ef9-caf5-4c07-aba6-741106b35f5b, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=fb8865d1-91e3-4d6a-9437-231beabc5816) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:09:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:12.777 104320 INFO neutron.agent.ovn.metadata.agent [-] Port fb8865d1-91e3-4d6a-9437-231beabc5816 in datapath 92ebde34-cbee-4b5e-ac06-7fdddcde07a5 unbound from our chassis#033[00m
Dec 11 01:09:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:12.778 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 92ebde34-cbee-4b5e-ac06-7fdddcde07a5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.780 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:09:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:12.779 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[219e37db-fdbe-449a-a47f-298866a71052]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:12.780 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5 namespace which is not needed anymore#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.793 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:12 np0005554845 kernel: tap49ac0b2b-42 (unregistering): left promiscuous mode
Dec 11 01:09:12 np0005554845 NetworkManager[55529]: <info>  [1765433352.7998] device (tap49ac0b2b-42): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.812 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:12Z|00143|binding|INFO|Releasing lport 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 from this chassis (sb_readonly=0)
Dec 11 01:09:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:12Z|00144|binding|INFO|Setting lport 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 down in Southbound
Dec 11 01:09:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:12Z|00145|binding|INFO|Removing iface tap49ac0b2b-42 ovn-installed in OVS
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.815 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.821 187132 INFO nova.compute.manager [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Took 10.73 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.822 187132 DEBUG nova.compute.manager [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:09:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:12.823 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:fe:6a:cf 2001:db8::f816:3eff:fefe:6acf'], port_security=['fa:16:3e:fe:6a:cf 2001:db8::f816:3eff:fefe:6acf'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fefe:6acf/64', 'neutron:device_id': '524e0fc6-c557-4d6d-a3bf-a9af1980bf6d', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'fb93259f-17ff-4ea0-aadc-09a566a9fe40', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a2bfa7fb-80ee-49db-a84d-7d408b52f281, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.830 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:12 np0005554845 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000d.scope: Deactivated successfully.
Dec 11 01:09:12 np0005554845 systemd[1]: machine-qemu\x2d6\x2dinstance\x2d0000000d.scope: Consumed 16.469s CPU time.
Dec 11 01:09:12 np0005554845 systemd-machined[153381]: Machine qemu-6-instance-0000000d terminated.
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.926 187132 INFO nova.compute.manager [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Took 11.28 seconds to build instance.#033[00m
Dec 11 01:09:12 np0005554845 neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5[216160]: [NOTICE]   (216164) : haproxy version is 2.8.14-c23fe91
Dec 11 01:09:12 np0005554845 neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5[216160]: [NOTICE]   (216164) : path to executable is /usr/sbin/haproxy
Dec 11 01:09:12 np0005554845 neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5[216160]: [WARNING]  (216164) : Exiting Master process...
Dec 11 01:09:12 np0005554845 neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5[216160]: [ALERT]    (216164) : Current worker (216166) exited with code 143 (Terminated)
Dec 11 01:09:12 np0005554845 neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5[216160]: [WARNING]  (216164) : All workers exited. Exiting... (0)
Dec 11 01:09:12 np0005554845 systemd[1]: libpod-b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195.scope: Deactivated successfully.
Dec 11 01:09:12 np0005554845 podman[217352]: 2025-12-11 06:09:12.951946729 +0000 UTC m=+0.056672486 container died b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.964 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.965 187132 DEBUG oslo_concurrency.lockutils [None req-a0d911c3-31c9-4211-8d81-84144081d1c0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.391s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:12 np0005554845 NetworkManager[55529]: <info>  [1765433352.9689] manager: (tap49ac0b2b-42): new Tun device (/org/freedesktop/NetworkManager/Devices/71)
Dec 11 01:09:12 np0005554845 nova_compute[187128]: 2025-12-11 06:09:12.978 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.007 187132 INFO nova.virt.libvirt.driver [-] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Instance destroyed successfully.#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.007 187132 DEBUG nova.objects.instance [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'resources' on Instance uuid 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:09:13 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195-userdata-shm.mount: Deactivated successfully.
Dec 11 01:09:13 np0005554845 systemd[1]: var-lib-containers-storage-overlay-a1e985ae3b9633304fab789af4b5a8f284c6b811b9a1e76f24a0b0cfb839e39b-merged.mount: Deactivated successfully.
Dec 11 01:09:13 np0005554845 podman[217352]: 2025-12-11 06:09:13.02614684 +0000 UTC m=+0.130872577 container cleanup b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.029 187132 DEBUG nova.virt.libvirt.vif [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1336912788',display_name='tempest-TestGettingAddress-server-1336912788',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1336912788',id=13,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIEZclnOFMrexdjXuOkORcLDtA8yO6kfE4DBOUgbM3gHbfN391UlOmGhfZVKD/zGl6Gj1jPXo/jXjCKVlMACkhXE/JYda9bh6TiWiKcbIr9HyCjOcURaG9csLYvNUZORNg==',key_name='tempest-TestGettingAddress-876887647',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:07:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-e20pu0cb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:07:58Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=524e0fc6-c557-4d6d-a3bf-a9af1980bf6d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.030 187132 DEBUG nova.network.os_vif_util [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.181", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.031 187132 DEBUG nova.network.os_vif_util [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:49:01:48,bridge_name='br-int',has_traffic_filtering=True,id=fb8865d1-91e3-4d6a-9437-231beabc5816,network=Network(92ebde34-cbee-4b5e-ac06-7fdddcde07a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8865d1-91') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.031 187132 DEBUG os_vif [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:01:48,bridge_name='br-int',has_traffic_filtering=True,id=fb8865d1-91e3-4d6a-9437-231beabc5816,network=Network(92ebde34-cbee-4b5e-ac06-7fdddcde07a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8865d1-91') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.033 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.034 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapfb8865d1-91, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.035 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.037 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:09:13 np0005554845 systemd[1]: libpod-conmon-b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195.scope: Deactivated successfully.
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.040 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.042 187132 INFO os_vif [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:49:01:48,bridge_name='br-int',has_traffic_filtering=True,id=fb8865d1-91e3-4d6a-9437-231beabc5816,network=Network(92ebde34-cbee-4b5e-ac06-7fdddcde07a5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapfb8865d1-91')#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.043 187132 DEBUG nova.virt.libvirt.vif [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:07:38Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1336912788',display_name='tempest-TestGettingAddress-server-1336912788',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1336912788',id=13,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIEZclnOFMrexdjXuOkORcLDtA8yO6kfE4DBOUgbM3gHbfN391UlOmGhfZVKD/zGl6Gj1jPXo/jXjCKVlMACkhXE/JYda9bh6TiWiKcbIr9HyCjOcURaG9csLYvNUZORNg==',key_name='tempest-TestGettingAddress-876887647',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:07:58Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-e20pu0cb',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:07:58Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=524e0fc6-c557-4d6d-a3bf-a9af1980bf6d,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.044 187132 DEBUG nova.network.os_vif_util [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.044 187132 DEBUG nova.network.os_vif_util [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:fe:6a:cf,bridge_name='br-int',has_traffic_filtering=True,id=49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0,network=Network(a2bcf811-4eea-465b-bdbf-ec77bd6ec91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ac0b2b-42') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.045 187132 DEBUG os_vif [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:6a:cf,bridge_name='br-int',has_traffic_filtering=True,id=49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0,network=Network(a2bcf811-4eea-465b-bdbf-ec77bd6ec91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ac0b2b-42') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.046 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.046 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap49ac0b2b-42, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.048 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.050 187132 INFO os_vif [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:fe:6a:cf,bridge_name='br-int',has_traffic_filtering=True,id=49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0,network=Network(a2bcf811-4eea-465b-bdbf-ec77bd6ec91f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap49ac0b2b-42')#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.051 187132 INFO nova.virt.libvirt.driver [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Deleting instance files /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d_del#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.051 187132 INFO nova.virt.libvirt.driver [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Deletion of /var/lib/nova/instances/524e0fc6-c557-4d6d-a3bf-a9af1980bf6d_del complete#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.140 187132 INFO nova.compute.manager [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.141 187132 DEBUG oslo.service.loopingcall [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.141 187132 DEBUG nova.compute.manager [-] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.142 187132 DEBUG nova.network.neutron [-] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:09:13 np0005554845 podman[217406]: 2025-12-11 06:09:13.156012359 +0000 UTC m=+0.105803258 container remove b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.162 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[dd6a3313-7590-4ba4-9d28-b7d56f890d2b]: (4, ('Thu Dec 11 06:09:12 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5 (b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195)\nb25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195\nThu Dec 11 06:09:13 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5 (b25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195)\nb25cbddf38422f747146528a5ae85abd48b13c44856e01784a14682ee6158195\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.164 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d539dc37-7366-4cc9-b53e-3a2fc6e443bb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.165 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap92ebde34-c0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.167 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:13 np0005554845 kernel: tap92ebde34-c0: left promiscuous mode
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.170 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.174 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1bc90b18-24e8-46e7-b870-cda901725bae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.189 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.197 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[206bfdb4-c92b-42a2-865f-87224f3fda80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.199 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0e319f10-b2f9-44e8-9ad9-eaed91941d1e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.216 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8342981a-262a-4bec-882d-7e70eee8e5a0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354673, 'reachable_time': 21185, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217419, 'error': None, 'target': 'ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.218 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-92ebde34-cbee-4b5e-ac06-7fdddcde07a5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.219 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[9c5ae580-f54c-4e3b-8e41-d0e84210e28d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.219 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 in datapath a2bcf811-4eea-465b-bdbf-ec77bd6ec91f unbound from our chassis#033[00m
Dec 11 01:09:13 np0005554845 systemd[1]: run-netns-ovnmeta\x2d92ebde34\x2dcbee\x2d4b5e\x2dac06\x2d7fdddcde07a5.mount: Deactivated successfully.
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.221 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network a2bcf811-4eea-465b-bdbf-ec77bd6ec91f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.221 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[77adaec1-6f0c-4d19-9b0c-cfd4c2e1871e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.222 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f namespace which is not needed anymore#033[00m
Dec 11 01:09:13 np0005554845 neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f[216240]: [NOTICE]   (216244) : haproxy version is 2.8.14-c23fe91
Dec 11 01:09:13 np0005554845 neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f[216240]: [NOTICE]   (216244) : path to executable is /usr/sbin/haproxy
Dec 11 01:09:13 np0005554845 neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f[216240]: [WARNING]  (216244) : Exiting Master process...
Dec 11 01:09:13 np0005554845 neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f[216240]: [ALERT]    (216244) : Current worker (216246) exited with code 143 (Terminated)
Dec 11 01:09:13 np0005554845 neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f[216240]: [WARNING]  (216244) : All workers exited. Exiting... (0)
Dec 11 01:09:13 np0005554845 systemd[1]: libpod-41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0.scope: Deactivated successfully.
Dec 11 01:09:13 np0005554845 podman[217437]: 2025-12-11 06:09:13.366525623 +0000 UTC m=+0.052003550 container died 41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:09:13 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0-userdata-shm.mount: Deactivated successfully.
Dec 11 01:09:13 np0005554845 systemd[1]: var-lib-containers-storage-overlay-8633deebe954b1f6cab70e8110a2cd0e3300c5ea45ed287441f7eb5acbc10b9b-merged.mount: Deactivated successfully.
Dec 11 01:09:13 np0005554845 podman[217437]: 2025-12-11 06:09:13.405405637 +0000 UTC m=+0.090883554 container cleanup 41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 01:09:13 np0005554845 systemd[1]: libpod-conmon-41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0.scope: Deactivated successfully.
Dec 11 01:09:13 np0005554845 podman[217466]: 2025-12-11 06:09:13.463315376 +0000 UTC m=+0.040082546 container remove 41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.470 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cc967854-9fd6-4c4b-8691-6e06d482eb6e]: (4, ('Thu Dec 11 06:09:13 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f (41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0)\n41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0\nThu Dec 11 06:09:13 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f (41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0)\n41dda15675a2ad7a90c015b1df4a6e732ff6b25d956045027ea5ab524c5c19f0\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.471 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b3c55fc4-d622-4d3c-bc84-c8c7a5014030]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.472 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapa2bcf811-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.475 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:13 np0005554845 kernel: tapa2bcf811-40: left promiscuous mode
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.486 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.489 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.490 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[701ceedc-0acd-4eae-82d4-a01c5aa61a81]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.504 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0b0889c1-b2d0-4640-becf-f7fb246a594e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.505 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[06717a09-a3d3-4f1f-b14b-a12b3a39d511]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.526 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7b103cf8-5a7e-4195-afda-02cb6426e1cd]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 354764, 'reachable_time': 15192, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217481, 'error': None, 'target': 'ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.527 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-a2bcf811-4eea-465b-bdbf-ec77bd6ec91f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:09:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:13.528 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[5fbf3aa0-d0d0-4b22-9d0c-1bb4dc06a15a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.811 187132 DEBUG nova.compute.manager [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-unplugged-fb8865d1-91e3-4d6a-9437-231beabc5816 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.812 187132 DEBUG oslo_concurrency.lockutils [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.812 187132 DEBUG oslo_concurrency.lockutils [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.812 187132 DEBUG oslo_concurrency.lockutils [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.812 187132 DEBUG nova.compute.manager [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] No waiting events found dispatching network-vif-unplugged-fb8865d1-91e3-4d6a-9437-231beabc5816 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.813 187132 DEBUG nova.compute.manager [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-unplugged-fb8865d1-91e3-4d6a-9437-231beabc5816 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.813 187132 DEBUG nova.compute.manager [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-plugged-fb8865d1-91e3-4d6a-9437-231beabc5816 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.813 187132 DEBUG oslo_concurrency.lockutils [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.813 187132 DEBUG oslo_concurrency.lockutils [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.814 187132 DEBUG oslo_concurrency.lockutils [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.814 187132 DEBUG nova.compute.manager [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] No waiting events found dispatching network-vif-plugged-fb8865d1-91e3-4d6a-9437-231beabc5816 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:09:13 np0005554845 nova_compute[187128]: 2025-12-11 06:09:13.814 187132 WARNING nova.compute.manager [req-b9ce49b3-5cfc-40a0-923c-a079e0c4841f req-fba589be-af2f-476f-9c1d-884c77792324 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received unexpected event network-vif-plugged-fb8865d1-91e3-4d6a-9437-231beabc5816 for instance with vm_state active and task_state deleting.#033[00m
Dec 11 01:09:14 np0005554845 systemd[1]: run-netns-ovnmeta\x2da2bcf811\x2d4eea\x2d465b\x2dbdbf\x2dec77bd6ec91f.mount: Deactivated successfully.
Dec 11 01:09:14 np0005554845 nova_compute[187128]: 2025-12-11 06:09:14.586 187132 DEBUG nova.network.neutron [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Updated VIF entry in instance network info cache for port fb8865d1-91e3-4d6a-9437-231beabc5816. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:09:14 np0005554845 nova_compute[187128]: 2025-12-11 06:09:14.587 187132 DEBUG nova.network.neutron [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Updating instance_info_cache with network_info: [{"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "address": "fa:16:3e:fe:6a:cf", "network": {"id": "a2bcf811-4eea-465b-bdbf-ec77bd6ec91f", "bridge": "br-int", "label": "tempest-network-smoke--1990849076", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fefe:6acf", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap49ac0b2b-42", "ovs_interfaceid": "49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:09:14 np0005554845 nova_compute[187128]: 2025-12-11 06:09:14.610 187132 DEBUG oslo_concurrency.lockutils [req-a41f405f-e7bb-4faa-b020-c0f78d1802ce req-c7ae8a8b-0235-4a62-a332-cbdf05c97ca7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:09:14 np0005554845 nova_compute[187128]: 2025-12-11 06:09:14.948 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.311 187132 DEBUG nova.compute.manager [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-unplugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.312 187132 DEBUG oslo_concurrency.lockutils [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.312 187132 DEBUG oslo_concurrency.lockutils [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.313 187132 DEBUG oslo_concurrency.lockutils [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.313 187132 DEBUG nova.compute.manager [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] No waiting events found dispatching network-vif-unplugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.313 187132 DEBUG nova.compute.manager [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-unplugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.313 187132 DEBUG nova.compute.manager [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-plugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.314 187132 DEBUG oslo_concurrency.lockutils [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.314 187132 DEBUG oslo_concurrency.lockutils [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.314 187132 DEBUG oslo_concurrency.lockutils [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.315 187132 DEBUG nova.compute.manager [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] No waiting events found dispatching network-vif-plugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.315 187132 WARNING nova.compute.manager [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received unexpected event network-vif-plugged-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 for instance with vm_state active and task_state deleting.#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.315 187132 DEBUG nova.compute.manager [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-deleted-49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.316 187132 INFO nova.compute.manager [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Neutron deleted interface 49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0; detaching it from the instance and deleting it from the info cache#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.316 187132 DEBUG nova.network.neutron [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Updating instance_info_cache with network_info: [{"id": "fb8865d1-91e3-4d6a-9437-231beabc5816", "address": "fa:16:3e:49:01:48", "network": {"id": "92ebde34-cbee-4b5e-ac06-7fdddcde07a5", "bridge": "br-int", "label": "tempest-network-smoke--1150642260", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapfb8865d1-91", "ovs_interfaceid": "fb8865d1-91e3-4d6a-9437-231beabc5816", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.350 187132 DEBUG nova.compute.manager [req-ab68ad31-926b-4303-b8ab-447c34e619ce req-7190705f-f1b7-4a52-a952-c36c1a5417f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Detach interface failed, port_id=49ac0b2b-42bb-47fb-ad03-d6d0fc457eb0, reason: Instance 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.579 187132 DEBUG nova.network.neutron [-] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.592 187132 INFO nova.compute.manager [-] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Took 2.45 seconds to deallocate network for instance.#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.643 187132 DEBUG oslo_concurrency.lockutils [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.643 187132 DEBUG oslo_concurrency.lockutils [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.713 187132 DEBUG nova.compute.provider_tree [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.736 187132 DEBUG nova.scheduler.client.report [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.769 187132 DEBUG oslo_concurrency.lockutils [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.797 187132 INFO nova.scheduler.client.report [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Deleted allocations for instance 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d#033[00m
Dec 11 01:09:15 np0005554845 nova_compute[187128]: 2025-12-11 06:09:15.870 187132 DEBUG oslo_concurrency.lockutils [None req-a0659e91-2c89-4cb0-a7b4-ab122c435b81 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "524e0fc6-c557-4d6d-a3bf-a9af1980bf6d" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.138s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:16 np0005554845 podman[217482]: 2025-12-11 06:09:16.135113948 +0000 UTC m=+0.062295540 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base 
Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Dec 11 01:09:17 np0005554845 nova_compute[187128]: 2025-12-11 06:09:17.432 187132 DEBUG nova.compute.manager [req-cee7f82a-4d9e-4491-af21-5854e0041ad8 req-04d376dc-3a9d-4ad3-a31b-47379dee2c1d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Received event network-vif-deleted-fb8865d1-91e3-4d6a-9437-231beabc5816 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:17 np0005554845 nova_compute[187128]: 2025-12-11 06:09:17.500 187132 DEBUG nova.compute.manager [req-34890649-8458-4325-983d-9fbc5dc9c587 req-0f1b7f52-3bf1-489a-9026-417ac666f5e1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Received event network-changed-f3f6ac24-c680-4caa-b2c1-d317380417d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:17 np0005554845 nova_compute[187128]: 2025-12-11 06:09:17.501 187132 DEBUG nova.compute.manager [req-34890649-8458-4325-983d-9fbc5dc9c587 req-0f1b7f52-3bf1-489a-9026-417ac666f5e1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Refreshing instance network info cache due to event network-changed-f3f6ac24-c680-4caa-b2c1-d317380417d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:09:17 np0005554845 nova_compute[187128]: 2025-12-11 06:09:17.502 187132 DEBUG oslo_concurrency.lockutils [req-34890649-8458-4325-983d-9fbc5dc9c587 req-0f1b7f52-3bf1-489a-9026-417ac666f5e1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:09:17 np0005554845 nova_compute[187128]: 2025-12-11 06:09:17.502 187132 DEBUG oslo_concurrency.lockutils [req-34890649-8458-4325-983d-9fbc5dc9c587 req-0f1b7f52-3bf1-489a-9026-417ac666f5e1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:09:17 np0005554845 nova_compute[187128]: 2025-12-11 06:09:17.503 187132 DEBUG nova.network.neutron [req-34890649-8458-4325-983d-9fbc5dc9c587 req-0f1b7f52-3bf1-489a-9026-417ac666f5e1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Refreshing network info cache for port f3f6ac24-c680-4caa-b2c1-d317380417d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:09:18 np0005554845 nova_compute[187128]: 2025-12-11 06:09:18.050 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:18 np0005554845 podman[217502]: 2025-12-11 06:09:18.13535544 +0000 UTC m=+0.068090566 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent)
Dec 11 01:09:18 np0005554845 podman[217503]: 2025-12-11 06:09:18.180646758 +0000 UTC m=+0.104553015 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 11 01:09:18 np0005554845 nova_compute[187128]: 2025-12-11 06:09:18.907 187132 DEBUG nova.network.neutron [req-34890649-8458-4325-983d-9fbc5dc9c587 req-0f1b7f52-3bf1-489a-9026-417ac666f5e1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Updated VIF entry in instance network info cache for port f3f6ac24-c680-4caa-b2c1-d317380417d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:09:18 np0005554845 nova_compute[187128]: 2025-12-11 06:09:18.908 187132 DEBUG nova.network.neutron [req-34890649-8458-4325-983d-9fbc5dc9c587 req-0f1b7f52-3bf1-489a-9026-417ac666f5e1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Updating instance_info_cache with network_info: [{"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:09:18 np0005554845 nova_compute[187128]: 2025-12-11 06:09:18.940 187132 DEBUG oslo_concurrency.lockutils [req-34890649-8458-4325-983d-9fbc5dc9c587 req-0f1b7f52-3bf1-489a-9026-417ac666f5e1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:09:19 np0005554845 nova_compute[187128]: 2025-12-11 06:09:19.950 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:20 np0005554845 podman[217545]: 2025-12-11 06:09:20.119197898 +0000 UTC m=+0.056616834 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=multipathd, org.label-schema.build-date=20251202, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:09:20 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:20Z|00146|binding|INFO|Releasing lport 46546633-a2a1-4a03-ac9c-5c5e47374adc from this chassis (sb_readonly=0)
Dec 11 01:09:20 np0005554845 nova_compute[187128]: 2025-12-11 06:09:20.955 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:21 np0005554845 nova_compute[187128]: 2025-12-11 06:09:21.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:09:21 np0005554845 nova_compute[187128]: 2025-12-11 06:09:21.712 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:21 np0005554845 nova_compute[187128]: 2025-12-11 06:09:21.713 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:21 np0005554845 nova_compute[187128]: 2025-12-11 06:09:21.713 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:21 np0005554845 nova_compute[187128]: 2025-12-11 06:09:21.713 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:09:21 np0005554845 nova_compute[187128]: 2025-12-11 06:09:21.794 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:21 np0005554845 nova_compute[187128]: 2025-12-11 06:09:21.861 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk --force-share --output=json" returned: 0 in 0.067s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:21 np0005554845 nova_compute[187128]: 2025-12-11 06:09:21.862 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:21 np0005554845 nova_compute[187128]: 2025-12-11 06:09:21.928 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.096 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.098 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5530MB free_disk=73.33011627197266GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.098 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.098 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.168 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance 99e00ae7-84c5-40a5-a280-10071d1df3f0 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.168 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.168 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.352 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.366 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.402 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.402 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.303s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:22 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:22.414 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=8, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=7) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:09:22 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:22.415 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:09:22 np0005554845 nova_compute[187128]: 2025-12-11 06:09:22.416 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:23 np0005554845 nova_compute[187128]: 2025-12-11 06:09:23.052 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:24 np0005554845 nova_compute[187128]: 2025-12-11 06:09:24.952 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:25 np0005554845 podman[217588]: 2025-12-11 06:09:25.135548666 +0000 UTC m=+0.064809887 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:09:25 np0005554845 podman[217589]: 2025-12-11 06:09:25.141552489 +0000 UTC m=+0.069591307 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, vcs-type=git, io.openshift.tags=minimal rhel9, distribution-scope=public, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.component=ubi9-minimal-container, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, release=1755695350, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, architecture=x86_64, container_name=openstack_network_exporter, io.openshift.expose-services=, vendor=Red Hat, Inc., io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc.)
Dec 11 01:09:25 np0005554845 nova_compute[187128]: 2025-12-11 06:09:25.398 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:09:25 np0005554845 nova_compute[187128]: 2025-12-11 06:09:25.398 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:09:25 np0005554845 nova_compute[187128]: 2025-12-11 06:09:25.399 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:09:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:25.417 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '8'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:25Z|00017|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:91:84:93 10.100.0.8
Dec 11 01:09:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:25Z|00018|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:91:84:93 10.100.0.8
Dec 11 01:09:25 np0005554845 nova_compute[187128]: 2025-12-11 06:09:25.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:09:25 np0005554845 nova_compute[187128]: 2025-12-11 06:09:25.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:09:25 np0005554845 nova_compute[187128]: 2025-12-11 06:09:25.765 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:09:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:26.221 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:26.222 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:26.223 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:26 np0005554845 nova_compute[187128]: 2025-12-11 06:09:26.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:09:27 np0005554845 nova_compute[187128]: 2025-12-11 06:09:27.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:09:27 np0005554845 nova_compute[187128]: 2025-12-11 06:09:27.690 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:09:28 np0005554845 nova_compute[187128]: 2025-12-11 06:09:28.006 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433353.0049458, 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:09:28 np0005554845 nova_compute[187128]: 2025-12-11 06:09:28.006 187132 INFO nova.compute.manager [-] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:09:28 np0005554845 nova_compute[187128]: 2025-12-11 06:09:28.029 187132 DEBUG nova.compute.manager [None req-6e061d3d-5cd0-4e8e-9913-3429ec640fe0 - - - - - -] [instance: 524e0fc6-c557-4d6d-a3bf-a9af1980bf6d] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:09:28 np0005554845 nova_compute[187128]: 2025-12-11 06:09:28.054 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:28 np0005554845 nova_compute[187128]: 2025-12-11 06:09:28.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:09:29 np0005554845 nova_compute[187128]: 2025-12-11 06:09:29.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:09:29 np0005554845 nova_compute[187128]: 2025-12-11 06:09:29.983 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.106 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000013', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'hostId': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.109 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 99e00ae7-84c5-40a5-a280-10071d1df3f0 / tapf3f6ac24-c6 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.109 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/network.outgoing.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5f326723-7fac-4edf-b4e5-628c052fbb85', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000013-99e00ae7-84c5-40a5-a280-10071d1df3f0-tapf3f6ac24-c6', 'timestamp': '2025-12-11T06:09:30.107674', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'tapf3f6ac24-c6', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:84:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3f6ac24-c6'}, 'message_id': 'f48158b6-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.809361005, 'message_signature': '7725f81f8ed643384f641b16933ab9758cd9f87991e51d41fb4fca86529e8ac9'}]}, 'timestamp': '2025-12-11 06:09:30.110416', '_unique_id': 'c365f9e29ada447f9902fa103329648b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.111 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.112 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.112 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.113 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508444165>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508444165>]
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.113 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.113 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6033e69d-bd8a-4778-862d-79585dbcd5b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000013-99e00ae7-84c5-40a5-a280-10071d1df3f0-tapf3f6ac24-c6', 'timestamp': '2025-12-11T06:09:30.113340', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'tapf3f6ac24-c6', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:84:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3f6ac24-c6'}, 'message_id': 'f481db88-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.809361005, 'message_signature': '81bc0f54d6c4627b1dcd9fcf0c9380da21f6bc2f98c2dc1c31de8776c624f765'}]}, 'timestamp': '2025-12-11 06:09:30.113714', '_unique_id': 'dc5b320ef33a410a978f99e898aac6a4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.114 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.115 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508444165>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508444165>]
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.115 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.139 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.write.requests volume: 297 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.140 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8edc675d-0422-47fe-81f7-76739c918de9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 297, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-vda', 'timestamp': '2025-12-11T06:09:30.115244', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f485f286-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': '0439b4519922c9ce0dec16bab91639a3f5ba9ba8f7ebe29732e34f4f56db9684'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-sda', 'timestamp': '2025-12-11T06:09:30.115244', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f4860154-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': '42c6c4a93117965009bbb96f42f556c1d82d26af3ca1bf8f6354282fa543804b'}]}, 'timestamp': '2025-12-11 06:09:30.140831', '_unique_id': '69bda63af16e4ae6a19ba99df9bd29bf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.142 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.143 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.143 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.read.bytes volume: 30206464 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.143 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3949b60-dad6-41bb-96d2-92c5f613082d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30206464, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-vda', 'timestamp': '2025-12-11T06:09:30.143085', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f486634c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': '5fdc6799415a56cfb6ac84900048bd14e2238768085e30379530612f3a85b131'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-sda', 'timestamp': '2025-12-11T06:09:30.143085', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f4866bda-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': '836422e2a8acdb2c6acd96de6309c2c2ec2cfb585af542b34fcaa0ca945a52c2'}]}, 'timestamp': '2025-12-11 06:09:30.143527', '_unique_id': '0dc7169a5756463d8f799155de43cfb4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.144 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/network.incoming.packets volume: 15 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '37c8eded-4cf9-4f3f-bd2b-6d047bde186e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 15, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000013-99e00ae7-84c5-40a5-a280-10071d1df3f0-tapf3f6ac24-c6', 'timestamp': '2025-12-11T06:09:30.144879', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'tapf3f6ac24-c6', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:84:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3f6ac24-c6'}, 'message_id': 'f486aa64-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.809361005, 'message_signature': '4fac2008cf8dd7d9ab72f48f631999327b6841000620799213197e67d4637ae2'}]}, 'timestamp': '2025-12-11 06:09:30.145171', '_unique_id': 'd4f41fafb8cc4b69b1ec06e555dd1e8e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.145 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.146 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.146 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.write.latency volume: 2377342370 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.146 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6ee88a62-da09-4daf-9e41-dd337d326478', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2377342370, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-vda', 'timestamp': '2025-12-11T06:09:30.146570', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f486ec5e-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': 'c5b95a99e91df5aa0af3a60a0ce931e3b60cc48600e6e4e8fbaa57c9760d482a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-sda', 'timestamp': '2025-12-11T06:09:30.146570', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f486f6fe-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': '20f52ad320af94ca0bf4351f1d666ba04ef4fbf0a61bc2614044c656710cbca9'}]}, 'timestamp': '2025-12-11 06:09:30.147115', '_unique_id': '1517d69b83b246fe82058c866572d404'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.147 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.148 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.148 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f5e92105-bd68-473b-a910-6096acba2cbd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000013-99e00ae7-84c5-40a5-a280-10071d1df3f0-tapf3f6ac24-c6', 'timestamp': '2025-12-11T06:09:30.148429', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'tapf3f6ac24-c6', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:84:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3f6ac24-c6'}, 'message_id': 'f487351a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.809361005, 'message_signature': '6ccbaa6fc0f0cbc00576da48c074cdce33423f1c14f184f963fa7e6bc2cc38e9'}]}, 'timestamp': '2025-12-11 06:09:30.148702', '_unique_id': '9fa97c31c5c54e85a2be1f7c3ddbc65c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.149 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ea66c62-1269-4aa6-85da-cd8af6de14cb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000013-99e00ae7-84c5-40a5-a280-10071d1df3f0-tapf3f6ac24-c6', 'timestamp': '2025-12-11T06:09:30.150068', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'tapf3f6ac24-c6', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:84:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3f6ac24-c6'}, 'message_id': 'f48774da-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.809361005, 'message_signature': 'bc6a88dba246cd42681664d876207983d225850773e6423e511eb4630874b8f3'}]}, 'timestamp': '2025-12-11 06:09:30.150365', '_unique_id': '26862dc5cedc49bfa15123a6d8480e4a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.150 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.151 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.151 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.write.bytes volume: 72904704 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.151 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b0e9eaa2-16d0-4020-ba8b-06f71f75599d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72904704, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-vda', 'timestamp': '2025-12-11T06:09:30.151581', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f487afcc-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': '814d8cdabbc8aa731f6bd74d2a8b08c69a846a340e02ae22536a15a08a616a40'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-sda', 'timestamp': '2025-12-11T06:09:30.151581', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f487b9cc-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': '1c226bd87d776dda25bd709b7db6e98759384bd74a1a3d60a8004b17b9afdee4'}]}, 'timestamp': '2025-12-11 06:09:30.152112', '_unique_id': '0d3471f3f4964f42b89986868c59144e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.152 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.153 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.153 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508444165>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508444165>]
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.153 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.165 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.165 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0426d5f6-c0f5-45ec-adbf-4285e91714ce', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-vda', 'timestamp': '2025-12-11T06:09:30.153826', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f489c438-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.855570124, 'message_signature': 'cb87071a159d2287c4a1cc7672b97b7ed9295d9f281d83c67f67d3f57abadd96'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-sda', 'timestamp': '2025-12-11T06:09:30.153826', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f489d266-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.855570124, 'message_signature': '554523577a6d769da8464f36f83f14ca584018c7a036d78aee78f05fe60c69ff'}]}, 'timestamp': '2025-12-11 06:09:30.165837', '_unique_id': '664a0d6e28404d278cec65201131340c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.166 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.167 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.180 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/cpu volume: 11130000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5a05ccc1-c0b3-44e0-91d3-94db165aadcd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11130000000, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'timestamp': '2025-12-11T06:09:30.167301', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'f48c2674-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.882407605, 'message_signature': 'd4495d4fe758580b4a11f88d55b6e5c9f1d93fe623266229d9dc14fda47c842e'}]}, 'timestamp': '2025-12-11 06:09:30.181106', '_unique_id': '2a8a45cf64d84daaa3cb7bcacd40d0c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.181 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.182 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.182 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.182 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508444165>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestNetworkAdvancedServerOps-server-1508444165>]
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.182 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.182 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a2dff0ea-bd96-4af2-ac7e-68316b1b0181', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-vda', 'timestamp': '2025-12-11T06:09:30.182860', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f48c750c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.855570124, 'message_signature': '99bbe919cb2471a5a168cef9499eb32a5a659245ab74d015514c8759057ac0e3'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 
'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-sda', 'timestamp': '2025-12-11T06:09:30.182860', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f48c7e26-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.855570124, 'message_signature': '25f7caf971de762411d657738b66ef2e8cbc07fe8a162138ec3f605199dce538'}]}, 'timestamp': '2025-12-11 06:09:30.183321', '_unique_id': '97b6e97a05af4e278edc07af641d4a4e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.183 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.184 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd16c92a1-c403-4198-b9f5-84460b0db01d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000013-99e00ae7-84c5-40a5-a280-10071d1df3f0-tapf3f6ac24-c6', 'timestamp': '2025-12-11T06:09:30.184577', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'tapf3f6ac24-c6', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:84:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3f6ac24-c6'}, 'message_id': 'f48cb814-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.809361005, 'message_signature': 'b0eacf593806256c3ea79ec74e8da857156eb02c4ce5c09f946fb08cf95fc7b2'}]}, 'timestamp': '2025-12-11 06:09:30.184822', '_unique_id': '86d6f373f0dc4f2794891eab6a788d73'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.185 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/network.incoming.bytes volume: 1940 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9af879dc-524c-4c83-a090-6e63f2910a26', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1940, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000013-99e00ae7-84c5-40a5-a280-10071d1df3f0-tapf3f6ac24-c6', 'timestamp': '2025-12-11T06:09:30.185908', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'tapf3f6ac24-c6', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:84:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3f6ac24-c6'}, 'message_id': 'f48ceb90-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.809361005, 'message_signature': 'd4f7e92f5026530cac9c10cea64396a2fc0c99ca3b587e894300c356febadc30'}]}, 'timestamp': '2025-12-11 06:09:30.186148', '_unique_id': '33ff2d461203440cb83f0338169a795a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.186 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.187 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.read.latency volume: 212974622 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.187 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.read.latency volume: 21575451 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e044017c-8406-4c6f-9923-68154329e344', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 212974622, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-vda', 'timestamp': '2025-12-11T06:09:30.187294', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f48d23c6-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': 'fd961dca2500be31d6797047eef95cf1eb3c8927e65ff568db8e61996cd2b475'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 21575451, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 
'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-sda', 'timestamp': '2025-12-11T06:09:30.187294', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f48d2cc2-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': 'a297c9cfb8318f6bb10b3bd69de98e1465e96cb01ba58117b2299996950da6f1'}]}, 'timestamp': '2025-12-11 06:09:30.187789', '_unique_id': '57d391140b0846f5acd08031a93b570c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.188 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/memory.usage volume: 40.44140625 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a88ba4b8-ebb3-406d-a01c-fbdf3cbfd6ee', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.44140625, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'timestamp': '2025-12-11T06:09:30.188843', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'f48d5e40-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.882407605, 'message_signature': 'c755d7cef1c4ab1cb05ffb8a3834606ea02a3515c2793841a9fb28e5e64631c9'}]}, 'timestamp': '2025-12-11 06:09:30.189061', '_unique_id': '5ed42b20f5db4f5ca11cee17e386ce89'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.190 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.190 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.190 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '434a6f83-c0fe-4e29-83e1-d82fa8120f42', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-vda', 'timestamp': '2025-12-11T06:09:30.190133', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f48d907c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.855570124, 'message_signature': '6380d440133dedca3ab676eb9d4b8ea5374d50f6ee44fd8123424c31a26cef8d'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 
'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-sda', 'timestamp': '2025-12-11T06:09:30.190133', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f48d990a-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.855570124, 'message_signature': '543251191a07b73dde8f3900be763e00dc47113e616cee0317afaf56f72fe09d'}]}, 'timestamp': '2025-12-11 06:09:30.190561', '_unique_id': '8da2c34e06e140c39951466799717689'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.191 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/network.outgoing.bytes volume: 1438 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fe95d01c-c654-4e36-b445-dd11df93fe96', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1438, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000013-99e00ae7-84c5-40a5-a280-10071d1df3f0-tapf3f6ac24-c6', 'timestamp': '2025-12-11T06:09:30.191895', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'tapf3f6ac24-c6', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:84:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3f6ac24-c6'}, 'message_id': 'f48dd582-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.809361005, 'message_signature': 'a3f2d2ab102129f2e5901c43ca879083fc8b54e982c81576cbeeec8c56752b4c'}]}, 'timestamp': '2025-12-11 06:09:30.192120', '_unique_id': 'f287bd5b34a84dcfaf26b95ba10a50dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.192 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.193 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.read.requests volume: 1087 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.193 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a854cb9f-59d3-4012-920d-87609de75e77', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1087, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-vda', 'timestamp': '2025-12-11T06:09:30.193181', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'f48e078c-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': 'c85a51dd918a6ff728e087dafcb401dc680cf361c7d196d095139697d3a763e5'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0-sda', 'timestamp': '2025-12-11T06:09:30.193181', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'instance-00000013', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'f48e12ae-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.816935021, 'message_signature': 'e980873431f844b20710be300dbb2be24b0d64e5b3f09135a0038fe5754b89bc'}]}, 'timestamp': '2025-12-11 06:09:30.193683', '_unique_id': 'db5a5f2c3a48474dadce02453c56d7d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.194 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'daab7cda-610f-413e-b10a-d3daae902023', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000013-99e00ae7-84c5-40a5-a280-10071d1df3f0-tapf3f6ac24-c6', 'timestamp': '2025-12-11T06:09:30.194793', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'tapf3f6ac24-c6', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:84:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3f6ac24-c6'}, 'message_id': 'f48e46ac-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.809361005, 'message_signature': 'e6fc63c45bd2e5c922ed01b2e2432e0ed1df6387fc3cd006dcb3f2c63a686dbe'}]}, 'timestamp': '2025-12-11 06:09:30.195018', '_unique_id': '516ce4da45d343e8a1ba712e4a2186d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 DEBUG ceilometer.compute.pollsters [-] 99e00ae7-84c5-40a5-a280-10071d1df3f0/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '79ac0b36-d241-4119-9439-513ff078db4a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_name': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_name': None, 'resource_id': 'instance-00000013-99e00ae7-84c5-40a5-a280-10071d1df3f0-tapf3f6ac24-c6', 'timestamp': '2025-12-11T06:09:30.196076', 'resource_metadata': {'display_name': 'tempest-TestNetworkAdvancedServerOps-server-1508444165', 'name': 'tapf3f6ac24-c6', 'instance_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'instance_type': 'm1.nano', 'host': '11b8000ddaf2867acee9a5bbcbfa61dbb3338e912ff6e89af49bd33f', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:91:84:93', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tapf3f6ac24-c6'}, 'message_id': 'f48e78b6-d657-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3644.809361005, 'message_signature': '1a2bb33b1cd59319f316e1cb7db2db56ea25a4bebef997db62c81077ddd07047'}]}, 'timestamp': '2025-12-11 06:09:30.196299', '_unique_id': '0e5897c27a1042d19db2142a84bbaecd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:09:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:09:30.196 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:09:31 np0005554845 nova_compute[187128]: 2025-12-11 06:09:31.106 187132 INFO nova.compute.manager [None req-251866da-5ce5-47a5-9799-c69f93f12490 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Get console output
Dec 11 01:09:31 np0005554845 nova_compute[187128]: 2025-12-11 06:09:31.113 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 11 01:09:31 np0005554845 nova_compute[187128]: 2025-12-11 06:09:31.449 187132 INFO nova.compute.manager [None req-8e6b9702-4bff-4153-946e-11349dadb52d 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Pausing
Dec 11 01:09:31 np0005554845 nova_compute[187128]: 2025-12-11 06:09:31.450 187132 DEBUG nova.objects.instance [None req-8e6b9702-4bff-4153-946e-11349dadb52d 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'flavor' on Instance uuid 99e00ae7-84c5-40a5-a280-10071d1df3f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 01:09:31 np0005554845 nova_compute[187128]: 2025-12-11 06:09:31.484 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433371.4836948, 99e00ae7-84c5-40a5-a280-10071d1df3f0 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 11 01:09:31 np0005554845 nova_compute[187128]: 2025-12-11 06:09:31.485 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] VM Paused (Lifecycle Event)
Dec 11 01:09:31 np0005554845 nova_compute[187128]: 2025-12-11 06:09:31.487 187132 DEBUG nova.compute.manager [None req-8e6b9702-4bff-4153-946e-11349dadb52d 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 11 01:09:31 np0005554845 nova_compute[187128]: 2025-12-11 06:09:31.524 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762
Dec 11 01:09:31 np0005554845 nova_compute[187128]: 2025-12-11 06:09:31.528 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: pausing, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:09:31 np0005554845 nova_compute[187128]: 2025-12-11 06:09:31.555 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] During sync_power_state the instance has a pending task (pausing). Skip.#033[00m
Dec 11 01:09:33 np0005554845 nova_compute[187128]: 2025-12-11 06:09:33.057 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:34 np0005554845 nova_compute[187128]: 2025-12-11 06:09:34.986 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:35 np0005554845 nova_compute[187128]: 2025-12-11 06:09:35.747 187132 INFO nova.compute.manager [None req-f029dbb7-0e9b-4605-85b8-8ed273c229ed 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Get console output#033[00m
Dec 11 01:09:35 np0005554845 nova_compute[187128]: 2025-12-11 06:09:35.751 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:09:35 np0005554845 nova_compute[187128]: 2025-12-11 06:09:35.926 187132 INFO nova.compute.manager [None req-17672d28-9764-4797-96f6-ef13b62504e0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Unpausing#033[00m
Dec 11 01:09:35 np0005554845 nova_compute[187128]: 2025-12-11 06:09:35.927 187132 DEBUG nova.objects.instance [None req-17672d28-9764-4797-96f6-ef13b62504e0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'flavor' on Instance uuid 99e00ae7-84c5-40a5-a280-10071d1df3f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:09:35 np0005554845 nova_compute[187128]: 2025-12-11 06:09:35.952 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433375.951967, 99e00ae7-84c5-40a5-a280-10071d1df3f0 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:09:35 np0005554845 nova_compute[187128]: 2025-12-11 06:09:35.953 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:09:35 np0005554845 virtqemud[186638]: argument unsupported: QEMU guest agent is not configured
Dec 11 01:09:35 np0005554845 nova_compute[187128]: 2025-12-11 06:09:35.957 187132 DEBUG nova.virt.libvirt.guest [None req-17672d28-9764-4797-96f6-ef13b62504e0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Dec 11 01:09:35 np0005554845 nova_compute[187128]: 2025-12-11 06:09:35.957 187132 DEBUG nova.compute.manager [None req-17672d28-9764-4797-96f6-ef13b62504e0 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:09:35 np0005554845 nova_compute[187128]: 2025-12-11 06:09:35.984 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:09:35 np0005554845 nova_compute[187128]: 2025-12-11 06:09:35.987 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: paused, current task_state: unpausing, current DB power_state: 3, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:09:36 np0005554845 nova_compute[187128]: 2025-12-11 06:09:36.018 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] During sync_power_state the instance has a pending task (unpausing). Skip.#033[00m
Dec 11 01:09:37 np0005554845 nova_compute[187128]: 2025-12-11 06:09:37.282 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:38 np0005554845 nova_compute[187128]: 2025-12-11 06:09:38.061 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:38 np0005554845 nova_compute[187128]: 2025-12-11 06:09:38.762 187132 INFO nova.compute.manager [None req-f7e6c778-0185-4cd7-835c-62336816f816 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Get console output#033[00m
Dec 11 01:09:38 np0005554845 nova_compute[187128]: 2025-12-11 06:09:38.768 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:09:39 np0005554845 nova_compute[187128]: 2025-12-11 06:09:39.989 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:40 np0005554845 podman[217628]: 2025-12-11 06:09:40.117712983 +0000 UTC m=+0.055870663 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.128 187132 DEBUG nova.compute.manager [req-0b8735d0-a952-4a5e-b7a6-a11128ee49ac req-f3ba7792-2c9e-4560-b59d-f2cceda0ed60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Received event network-changed-f3f6ac24-c680-4caa-b2c1-d317380417d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.128 187132 DEBUG nova.compute.manager [req-0b8735d0-a952-4a5e-b7a6-a11128ee49ac req-f3ba7792-2c9e-4560-b59d-f2cceda0ed60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Refreshing instance network info cache due to event network-changed-f3f6ac24-c680-4caa-b2c1-d317380417d7. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.128 187132 DEBUG oslo_concurrency.lockutils [req-0b8735d0-a952-4a5e-b7a6-a11128ee49ac req-f3ba7792-2c9e-4560-b59d-f2cceda0ed60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.129 187132 DEBUG oslo_concurrency.lockutils [req-0b8735d0-a952-4a5e-b7a6-a11128ee49ac req-f3ba7792-2c9e-4560-b59d-f2cceda0ed60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.129 187132 DEBUG nova.network.neutron [req-0b8735d0-a952-4a5e-b7a6-a11128ee49ac req-f3ba7792-2c9e-4560-b59d-f2cceda0ed60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Refreshing network info cache for port f3f6ac24-c680-4caa-b2c1-d317380417d7 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.195 187132 DEBUG oslo_concurrency.lockutils [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "99e00ae7-84c5-40a5-a280-10071d1df3f0" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.196 187132 DEBUG oslo_concurrency.lockutils [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.196 187132 DEBUG oslo_concurrency.lockutils [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.196 187132 DEBUG oslo_concurrency.lockutils [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.197 187132 DEBUG oslo_concurrency.lockutils [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.198 187132 INFO nova.compute.manager [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Terminating instance#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.199 187132 DEBUG nova.compute.manager [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:09:40 np0005554845 kernel: tapf3f6ac24-c6 (unregistering): left promiscuous mode
Dec 11 01:09:40 np0005554845 NetworkManager[55529]: <info>  [1765433380.2225] device (tapf3f6ac24-c6): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.246 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:40 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:40Z|00147|binding|INFO|Releasing lport f3f6ac24-c680-4caa-b2c1-d317380417d7 from this chassis (sb_readonly=0)
Dec 11 01:09:40 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:40Z|00148|binding|INFO|Setting lport f3f6ac24-c680-4caa-b2c1-d317380417d7 down in Southbound
Dec 11 01:09:40 np0005554845 ovn_controller[95428]: 2025-12-11T06:09:40Z|00149|binding|INFO|Removing iface tapf3f6ac24-c6 ovn-installed in OVS
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.251 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.256 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:91:84:93 10.100.0.8'], port_security=['fa:16:3e:91:84:93 10.100.0.8'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.8/28', 'neutron:device_id': '99e00ae7-84c5-40a5-a280-10071d1df3f0', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-7d668d5f-0b74-4535-a166-89784d7ca5e9', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '4', 'neutron:security_group_ids': '65e258bf-a170-4ed6-bb0c-cb9465ced260', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=4770d3b8-5c7f-4649-944d-65f56c7b9c25, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=f3f6ac24-c680-4caa-b2c1-d317380417d7) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.258 104320 INFO neutron.agent.ovn.metadata.agent [-] Port f3f6ac24-c680-4caa-b2c1-d317380417d7 in datapath 7d668d5f-0b74-4535-a166-89784d7ca5e9 unbound from our chassis#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.262 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 7d668d5f-0b74-4535-a166-89784d7ca5e9, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.263 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c9651dbc-3a3f-4c7b-a6da-16a7bc73a3c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.264 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9 namespace which is not needed anymore#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.265 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:40 np0005554845 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000013.scope: Deactivated successfully.
Dec 11 01:09:40 np0005554845 systemd[1]: machine-qemu\x2d9\x2dinstance\x2d00000013.scope: Consumed 12.727s CPU time.
Dec 11 01:09:40 np0005554845 systemd-machined[153381]: Machine qemu-9-instance-00000013 terminated.
Dec 11 01:09:40 np0005554845 neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9[217311]: [NOTICE]   (217315) : haproxy version is 2.8.14-c23fe91
Dec 11 01:09:40 np0005554845 neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9[217311]: [NOTICE]   (217315) : path to executable is /usr/sbin/haproxy
Dec 11 01:09:40 np0005554845 neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9[217311]: [WARNING]  (217315) : Exiting Master process...
Dec 11 01:09:40 np0005554845 neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9[217311]: [WARNING]  (217315) : Exiting Master process...
Dec 11 01:09:40 np0005554845 neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9[217311]: [ALERT]    (217315) : Current worker (217317) exited with code 143 (Terminated)
Dec 11 01:09:40 np0005554845 neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9[217311]: [WARNING]  (217315) : All workers exited. Exiting... (0)
Dec 11 01:09:40 np0005554845 systemd[1]: libpod-fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86.scope: Deactivated successfully.
Dec 11 01:09:40 np0005554845 podman[217676]: 2025-12-11 06:09:40.399956851 +0000 UTC m=+0.044099423 container died fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:09:40 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86-userdata-shm.mount: Deactivated successfully.
Dec 11 01:09:40 np0005554845 systemd[1]: var-lib-containers-storage-overlay-8c2a13c0b32e19225b00f4317930be425728269bd92e9a85a8da0d01b129d447-merged.mount: Deactivated successfully.
Dec 11 01:09:40 np0005554845 podman[217676]: 2025-12-11 06:09:40.4421299 +0000 UTC m=+0.086272492 container cleanup fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 01:09:40 np0005554845 systemd[1]: libpod-conmon-fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86.scope: Deactivated successfully.
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.475 187132 INFO nova.virt.libvirt.driver [-] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Instance destroyed successfully.#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.476 187132 DEBUG nova.objects.instance [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'resources' on Instance uuid 99e00ae7-84c5-40a5-a280-10071d1df3f0 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.504 187132 DEBUG nova.virt.libvirt.vif [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:08:59Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1508444165',display_name='tempest-TestNetworkAdvancedServerOps-server-1508444165',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1508444165',id=19,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBFubs0ldAofD3ANccMMmgq8hmjh8AurFdwi5RaewoBl6Y+Vx1ron6toTcXpzNxLzsBUlrUV4uy79ncS16TnDcm8ejEkhsGMufiWD1vESOHX2y+PrpdIcJjO90kjoZ11BNw==',key_name='tempest-TestNetworkAdvancedServerOps-735374275',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:09:12Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-r0fn2m1c',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:09:36Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=99e00ae7-84c5-40a5-a280-10071d1df3f0,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.504 187132 DEBUG nova.network.os_vif_util [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.506 187132 DEBUG nova.network.os_vif_util [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:91:84:93,bridge_name='br-int',has_traffic_filtering=True,id=f3f6ac24-c680-4caa-b2c1-d317380417d7,network=Network(7d668d5f-0b74-4535-a166-89784d7ca5e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3f6ac24-c6') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.507 187132 DEBUG os_vif [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:84:93,bridge_name='br-int',has_traffic_filtering=True,id=f3f6ac24-c680-4caa-b2c1-d317380417d7,network=Network(7d668d5f-0b74-4535-a166-89784d7ca5e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3f6ac24-c6') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.511 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.511 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapf3f6ac24-c6, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.513 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.516 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:40 np0005554845 podman[217717]: 2025-12-11 06:09:40.518342985 +0000 UTC m=+0.047801233 container remove fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.519 187132 INFO os_vif [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:91:84:93,bridge_name='br-int',has_traffic_filtering=True,id=f3f6ac24-c680-4caa-b2c1-d317380417d7,network=Network(7d668d5f-0b74-4535-a166-89784d7ca5e9),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tapf3f6ac24-c6')#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.520 187132 INFO nova.virt.libvirt.driver [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Deleting instance files /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0_del#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.520 187132 INFO nova.virt.libvirt.driver [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Deletion of /var/lib/nova/instances/99e00ae7-84c5-40a5-a280-10071d1df3f0_del complete#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.527 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5d21200b-b80c-4298-bbe3-d81468cc5607]: (4, ('Thu Dec 11 06:09:40 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9 (fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86)\nfe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86\nThu Dec 11 06:09:40 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9 (fe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86)\nfe49395eca8a39ef044d2552b2d8cc2190dfcc516cdb393dfbea97b730d81a86\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.530 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4516467c-4512-4bb2-811e-df2db3207784]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.531 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7d668d5f-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:09:40 np0005554845 kernel: tap7d668d5f-00: left promiscuous mode
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.532 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.544 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.550 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f8a7a4a9-2899-42e5-a980-abfd98730b6a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.565 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[465b6466-1ef6-4b5c-8803-1c276df3280a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.566 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fa91b55f-1524-43b6-b1ed-6ef78389bfe0]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.584 187132 INFO nova.compute.manager [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.585 187132 DEBUG oslo.service.loopingcall [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.585 187132 DEBUG nova.compute.manager [-] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:09:40 np0005554845 nova_compute[187128]: 2025-12-11 06:09:40.585 187132 DEBUG nova.network.neutron [-] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.586 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c5b43693-695c-47d4-a0f2-be9a11d91a22]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 362454, 'reachable_time': 43078, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217736, 'error': None, 'target': 'ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.589 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-7d668d5f-0b74-4535-a166-89784d7ca5e9 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:09:40 np0005554845 systemd[1]: run-netns-ovnmeta\x2d7d668d5f\x2d0b74\x2d4535\x2da166\x2d89784d7ca5e9.mount: Deactivated successfully.
Dec 11 01:09:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:40.589 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[f9817d99-2916-4e26-a6d3-6b6ccf1e5edc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.099 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.146 187132 DEBUG nova.compute.manager [req-111a89a4-7da7-45dc-8d40-0a0f818751b0 req-bc599fcb-4dba-452b-a301-2882c1124e56 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Received event network-vif-unplugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.146 187132 DEBUG oslo_concurrency.lockutils [req-111a89a4-7da7-45dc-8d40-0a0f818751b0 req-bc599fcb-4dba-452b-a301-2882c1124e56 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.146 187132 DEBUG oslo_concurrency.lockutils [req-111a89a4-7da7-45dc-8d40-0a0f818751b0 req-bc599fcb-4dba-452b-a301-2882c1124e56 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.147 187132 DEBUG oslo_concurrency.lockutils [req-111a89a4-7da7-45dc-8d40-0a0f818751b0 req-bc599fcb-4dba-452b-a301-2882c1124e56 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.147 187132 DEBUG nova.compute.manager [req-111a89a4-7da7-45dc-8d40-0a0f818751b0 req-bc599fcb-4dba-452b-a301-2882c1124e56 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] No waiting events found dispatching network-vif-unplugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.147 187132 DEBUG nova.compute.manager [req-111a89a4-7da7-45dc-8d40-0a0f818751b0 req-bc599fcb-4dba-452b-a301-2882c1124e56 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Received event network-vif-unplugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.803 187132 DEBUG nova.network.neutron [-] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.873 187132 INFO nova.compute.manager [-] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Took 1.29 seconds to deallocate network for instance.#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.928 187132 DEBUG oslo_concurrency.lockutils [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.928 187132 DEBUG oslo_concurrency.lockutils [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:41 np0005554845 nova_compute[187128]: 2025-12-11 06:09:41.989 187132 DEBUG nova.compute.provider_tree [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:09:42 np0005554845 nova_compute[187128]: 2025-12-11 06:09:42.003 187132 DEBUG nova.scheduler.client.report [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:09:42 np0005554845 nova_compute[187128]: 2025-12-11 06:09:42.025 187132 DEBUG oslo_concurrency.lockutils [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.097s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:42 np0005554845 nova_compute[187128]: 2025-12-11 06:09:42.067 187132 INFO nova.scheduler.client.report [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Deleted allocations for instance 99e00ae7-84c5-40a5-a280-10071d1df3f0#033[00m
Dec 11 01:09:42 np0005554845 nova_compute[187128]: 2025-12-11 06:09:42.147 187132 DEBUG oslo_concurrency.lockutils [None req-cb3a0243-9967-403a-9dc2-55a7a0cb4918 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.951s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:42 np0005554845 nova_compute[187128]: 2025-12-11 06:09:42.245 187132 DEBUG nova.compute.manager [req-486e3e6b-838d-4dcc-b404-45748eebb932 req-d52b2d97-8cf2-4736-a965-eb471f4a0132 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Received event network-vif-deleted-f3f6ac24-c680-4caa-b2c1-d317380417d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:42 np0005554845 nova_compute[187128]: 2025-12-11 06:09:42.391 187132 DEBUG nova.network.neutron [req-0b8735d0-a952-4a5e-b7a6-a11128ee49ac req-f3ba7792-2c9e-4560-b59d-f2cceda0ed60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Updated VIF entry in instance network info cache for port f3f6ac24-c680-4caa-b2c1-d317380417d7. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:09:42 np0005554845 nova_compute[187128]: 2025-12-11 06:09:42.392 187132 DEBUG nova.network.neutron [req-0b8735d0-a952-4a5e-b7a6-a11128ee49ac req-f3ba7792-2c9e-4560-b59d-f2cceda0ed60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Updating instance_info_cache with network_info: [{"id": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "address": "fa:16:3e:91:84:93", "network": {"id": "7d668d5f-0b74-4535-a166-89784d7ca5e9", "bridge": "br-int", "label": "tempest-network-smoke--869272098", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.8", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tapf3f6ac24-c6", "ovs_interfaceid": "f3f6ac24-c680-4caa-b2c1-d317380417d7", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:09:42 np0005554845 nova_compute[187128]: 2025-12-11 06:09:42.429 187132 DEBUG oslo_concurrency.lockutils [req-0b8735d0-a952-4a5e-b7a6-a11128ee49ac req-f3ba7792-2c9e-4560-b59d-f2cceda0ed60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-99e00ae7-84c5-40a5-a280-10071d1df3f0" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:09:43 np0005554845 nova_compute[187128]: 2025-12-11 06:09:43.261 187132 DEBUG nova.compute.manager [req-2aeb5eb5-1994-4d2f-a08e-14fd69da0965 req-58e75e47-8750-40cb-a1dd-d91fb7111801 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Received event network-vif-plugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:09:43 np0005554845 nova_compute[187128]: 2025-12-11 06:09:43.262 187132 DEBUG oslo_concurrency.lockutils [req-2aeb5eb5-1994-4d2f-a08e-14fd69da0965 req-58e75e47-8750-40cb-a1dd-d91fb7111801 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:43 np0005554845 nova_compute[187128]: 2025-12-11 06:09:43.262 187132 DEBUG oslo_concurrency.lockutils [req-2aeb5eb5-1994-4d2f-a08e-14fd69da0965 req-58e75e47-8750-40cb-a1dd-d91fb7111801 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:43 np0005554845 nova_compute[187128]: 2025-12-11 06:09:43.262 187132 DEBUG oslo_concurrency.lockutils [req-2aeb5eb5-1994-4d2f-a08e-14fd69da0965 req-58e75e47-8750-40cb-a1dd-d91fb7111801 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "99e00ae7-84c5-40a5-a280-10071d1df3f0-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:43 np0005554845 nova_compute[187128]: 2025-12-11 06:09:43.263 187132 DEBUG nova.compute.manager [req-2aeb5eb5-1994-4d2f-a08e-14fd69da0965 req-58e75e47-8750-40cb-a1dd-d91fb7111801 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] No waiting events found dispatching network-vif-plugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:09:43 np0005554845 nova_compute[187128]: 2025-12-11 06:09:43.263 187132 WARNING nova.compute.manager [req-2aeb5eb5-1994-4d2f-a08e-14fd69da0965 req-58e75e47-8750-40cb-a1dd-d91fb7111801 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Received unexpected event network-vif-plugged-f3f6ac24-c680-4caa-b2c1-d317380417d7 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:09:44 np0005554845 nova_compute[187128]: 2025-12-11 06:09:44.643 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:44 np0005554845 nova_compute[187128]: 2025-12-11 06:09:44.851 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:44 np0005554845 nova_compute[187128]: 2025-12-11 06:09:44.990 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:45 np0005554845 nova_compute[187128]: 2025-12-11 06:09:45.515 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:46 np0005554845 systemd[1]: virtproxyd.service: Deactivated successfully.
Dec 11 01:09:46 np0005554845 podman[217739]: 2025-12-11 06:09:46.682004655 +0000 UTC m=+0.072780193 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, 
tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 11 01:09:47 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:47.667 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:a8:b0:31 2001:db8:0:1:f816:3eff:fea8:b031 2001:db8::f816:3eff:fea8:b031'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fea8:b031/64 2001:db8::f816:3eff:fea8:b031/64', 'neutron:device_id': 'ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e539a2e-efc5-4d88-a649-84787d0021ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70995ac2-b26f-4427-9e41-8f354c5ed362, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=3872e131-c169-4394-ac83-4609db001ee7) old=Port_Binding(mac=['fa:16:3e:a8:b0:31 2001:db8::f816:3eff:fea8:b031'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fea8:b031/64', 'neutron:device_id': 'ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e539a2e-efc5-4d88-a649-84787d0021ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 
'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:09:47 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:47.668 104320 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 3872e131-c169-4394-ac83-4609db001ee7 in datapath 1e539a2e-efc5-4d88-a649-84787d0021ea updated#033[00m
Dec 11 01:09:47 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:47.670 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1e539a2e-efc5-4d88-a649-84787d0021ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:09:47 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:09:47.671 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ce248fff-c9c0-410a-a8e4-335c5587d489]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:09:49 np0005554845 podman[217759]: 2025-12-11 06:09:49.141305689 +0000 UTC m=+0.076525666 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, 
org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Dec 11 01:09:49 np0005554845 podman[217760]: 2025-12-11 06:09:49.158322192 +0000 UTC m=+0.090872056 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ovn_controller, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 11 01:09:49 np0005554845 nova_compute[187128]: 2025-12-11 06:09:49.993 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:50 np0005554845 nova_compute[187128]: 2025-12-11 06:09:50.517 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:51 np0005554845 podman[217800]: 2025-12-11 06:09:51.157700389 +0000 UTC m=+0.088268465 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 01:09:54 np0005554845 nova_compute[187128]: 2025-12-11 06:09:54.995 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:55 np0005554845 nova_compute[187128]: 2025-12-11 06:09:55.471 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433380.4699404, 99e00ae7-84c5-40a5-a280-10071d1df3f0 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:09:55 np0005554845 nova_compute[187128]: 2025-12-11 06:09:55.471 187132 INFO nova.compute.manager [-] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:09:55 np0005554845 nova_compute[187128]: 2025-12-11 06:09:55.514 187132 DEBUG nova.compute.manager [None req-4a309285-dbdd-4716-bc96-f6d7e16d390a - - - - - -] [instance: 99e00ae7-84c5-40a5-a280-10071d1df3f0] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:09:55 np0005554845 nova_compute[187128]: 2025-12-11 06:09:55.519 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:09:56 np0005554845 podman[217820]: 2025-12-11 06:09:56.120597574 +0000 UTC m=+0.055104952 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:09:56 np0005554845 podman[217821]: 2025-12-11 06:09:56.125006904 +0000 UTC m=+0.056689996 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, name=ubi9-minimal, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1755695350, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, vcs-type=git, distribution-scope=public)
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.257 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.257 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.274 187132 DEBUG nova.compute.manager [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.366 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.367 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.375 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.376 187132 INFO nova.compute.claims [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.494 187132 DEBUG nova.compute.provider_tree [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.509 187132 DEBUG nova.scheduler.client.report [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.532 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.165s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.533 187132 DEBUG nova.compute.manager [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.596 187132 DEBUG nova.compute.manager [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.596 187132 DEBUG nova.network.neutron [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.615 187132 INFO nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.641 187132 DEBUG nova.compute.manager [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.734 187132 DEBUG nova.compute.manager [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.736 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.736 187132 INFO nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Creating image(s)#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.737 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "/var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.737 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.738 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.757 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.822 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.825 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.826 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.856 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.887 187132 DEBUG nova.policy [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.949 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:58 np0005554845 nova_compute[187128]: 2025-12-11 06:09:58.951 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.000 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk 1073741824" returned: 0 in 0.049s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.002 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.002 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.066 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.067 187132 DEBUG nova.virt.disk.api [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Checking if we can resize image /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.067 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.157 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.158 187132 DEBUG nova.virt.disk.api [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Cannot resize image /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.158 187132 DEBUG nova.objects.instance [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'migration_context' on Instance uuid 182af6cf-b56e-4c6a-aeb5-092944f1745a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.178 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.178 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Ensure instance console log exists: /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.179 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.179 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.179 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.930 187132 DEBUG nova.network.neutron [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Successfully created port: 4548537f-6484-4703-a9a0-4975e2aa784b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:09:59 np0005554845 nova_compute[187128]: 2025-12-11 06:09:59.997 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:00 np0005554845 nova_compute[187128]: 2025-12-11 06:10:00.521 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:00 np0005554845 nova_compute[187128]: 2025-12-11 06:10:00.883 187132 DEBUG nova.network.neutron [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Successfully updated port: 4548537f-6484-4703-a9a0-4975e2aa784b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:10:00 np0005554845 nova_compute[187128]: 2025-12-11 06:10:00.897 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:10:00 np0005554845 nova_compute[187128]: 2025-12-11 06:10:00.897 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquired lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:10:00 np0005554845 nova_compute[187128]: 2025-12-11 06:10:00.897 187132 DEBUG nova.network.neutron [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:10:01 np0005554845 nova_compute[187128]: 2025-12-11 06:10:01.055 187132 DEBUG nova.network.neutron [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:10:02 np0005554845 nova_compute[187128]: 2025-12-11 06:10:02.376 187132 DEBUG nova.compute.manager [req-39e2a422-f9b1-4513-b292-9f628b85d507 req-f7376214-671c-45c0-aa2c-a6301429dca6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-changed-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:02 np0005554845 nova_compute[187128]: 2025-12-11 06:10:02.376 187132 DEBUG nova.compute.manager [req-39e2a422-f9b1-4513-b292-9f628b85d507 req-f7376214-671c-45c0-aa2c-a6301429dca6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Refreshing instance network info cache due to event network-changed-4548537f-6484-4703-a9a0-4975e2aa784b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:10:02 np0005554845 nova_compute[187128]: 2025-12-11 06:10:02.376 187132 DEBUG oslo_concurrency.lockutils [req-39e2a422-f9b1-4513-b292-9f628b85d507 req-f7376214-671c-45c0-aa2c-a6301429dca6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.488 187132 DEBUG nova.network.neutron [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Updating instance_info_cache with network_info: [{"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.506 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Releasing lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.506 187132 DEBUG nova.compute.manager [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Instance network_info: |[{"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.507 187132 DEBUG oslo_concurrency.lockutils [req-39e2a422-f9b1-4513-b292-9f628b85d507 req-f7376214-671c-45c0-aa2c-a6301429dca6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.507 187132 DEBUG nova.network.neutron [req-39e2a422-f9b1-4513-b292-9f628b85d507 req-f7376214-671c-45c0-aa2c-a6301429dca6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Refreshing network info cache for port 4548537f-6484-4703-a9a0-4975e2aa784b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.510 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Start _get_guest_xml network_info=[{"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.514 187132 WARNING nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.520 187132 DEBUG nova.virt.libvirt.host [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.521 187132 DEBUG nova.virt.libvirt.host [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.525 187132 DEBUG nova.virt.libvirt.host [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.525 187132 DEBUG nova.virt.libvirt.host [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.526 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.526 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.527 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.527 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.527 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.527 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.528 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.528 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.528 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.529 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.529 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.529 187132 DEBUG nova.virt.hardware [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.533 187132 DEBUG nova.virt.libvirt.vif [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1684014384',display_name='tempest-TestNetworkAdvancedServerOps-server-1684014384',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1684014384',id=22,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZn2r+4XJ+BXVtMgu99zp7c2YbMyuHNcWOcnaOXRTzY0GtIyqBDnE+K2336Ko+1tdZUpzJeFdbHNec8NIxnOzc6MVsnUK9kDOH2YZAfybhw/CgYHrjVTBGZLsW2tYlTcQ==',key_name='tempest-TestNetworkAdvancedServerOps-241632790',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-63kpzom1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:09:58Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=182af6cf-b56e-4c6a-aeb5-092944f1745a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.533 187132 DEBUG nova.network.os_vif_util [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.534 187132 DEBUG nova.network.os_vif_util [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:3d:67,bridge_name='br-int',has_traffic_filtering=True,id=4548537f-6484-4703-a9a0-4975e2aa784b,network=Network(5914354e-3ed3-47fd-a912-9c7227988a8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4548537f-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.535 187132 DEBUG nova.objects.instance [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'pci_devices' on Instance uuid 182af6cf-b56e-4c6a-aeb5-092944f1745a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.551 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  <uuid>182af6cf-b56e-4c6a-aeb5-092944f1745a</uuid>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  <name>instance-00000016</name>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1684014384</nova:name>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:10:04</nova:creationTime>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:        <nova:user uuid="40cb523bfe1e4484bb2e91c903500c97">tempest-TestNetworkAdvancedServerOps-369129245-project-member</nova:user>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:        <nova:project uuid="3ec4c03cd7274517b88d9087ad4cbd83">tempest-TestNetworkAdvancedServerOps-369129245</nova:project>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:        <nova:port uuid="4548537f-6484-4703-a9a0-4975e2aa784b">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.7" ipVersion="4"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <entry name="serial">182af6cf-b56e-4c6a-aeb5-092944f1745a</entry>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <entry name="uuid">182af6cf-b56e-4c6a-aeb5-092944f1745a</entry>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk.config"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:ed:3d:67"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <target dev="tap4548537f-64"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/console.log" append="off"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:10:04 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:10:04 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:10:04 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:10:04 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.552 187132 DEBUG nova.compute.manager [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Preparing to wait for external event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.552 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.552 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.552 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.553 187132 DEBUG nova.virt.libvirt.vif [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1684014384',display_name='tempest-TestNetworkAdvancedServerOps-server-1684014384',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1684014384',id=22,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZn2r+4XJ+BXVtMgu99zp7c2YbMyuHNcWOcnaOXRTzY0GtIyqBDnE+K2336Ko+1tdZUpzJeFdbHNec8NIxnOzc6MVsnUK9kDOH2YZAfybhw/CgYHrjVTBGZLsW2tYlTcQ==',key_name='tempest-TestNetworkAdvancedServerOps-241632790',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-63kpzom1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:09:58Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=182af6cf-b56e-4c6a-aeb5-092944f1745a,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.553 187132 DEBUG nova.network.os_vif_util [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.554 187132 DEBUG nova.network.os_vif_util [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:ed:3d:67,bridge_name='br-int',has_traffic_filtering=True,id=4548537f-6484-4703-a9a0-4975e2aa784b,network=Network(5914354e-3ed3-47fd-a912-9c7227988a8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4548537f-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.554 187132 DEBUG os_vif [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:3d:67,bridge_name='br-int',has_traffic_filtering=True,id=4548537f-6484-4703-a9a0-4975e2aa784b,network=Network(5914354e-3ed3-47fd-a912-9c7227988a8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4548537f-64') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.554 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.555 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.555 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.557 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.557 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4548537f-64, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.557 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4548537f-64, col_values=(('external_ids', {'iface-id': '4548537f-6484-4703-a9a0-4975e2aa784b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:ed:3d:67', 'vm-uuid': '182af6cf-b56e-4c6a-aeb5-092944f1745a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.559 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:04 np0005554845 NetworkManager[55529]: <info>  [1765433404.5604] manager: (tap4548537f-64): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/72)
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.561 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.566 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.568 187132 INFO os_vif [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:ed:3d:67,bridge_name='br-int',has_traffic_filtering=True,id=4548537f-6484-4703-a9a0-4975e2aa784b,network=Network(5914354e-3ed3-47fd-a912-9c7227988a8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4548537f-64')#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.625 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.626 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.626 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No VIF found with MAC fa:16:3e:ed:3d:67, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.627 187132 INFO nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Using config drive#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.940 187132 INFO nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Creating config drive at /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk.config#033[00m
Dec 11 01:10:04 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.945 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzgfmi2__ execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:04.999 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.070 187132 DEBUG oslo_concurrency.processutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpzgfmi2__" returned: 0 in 0.124s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:10:05 np0005554845 kernel: tap4548537f-64: entered promiscuous mode
Dec 11 01:10:05 np0005554845 NetworkManager[55529]: <info>  [1765433405.1518] manager: (tap4548537f-64): new Tun device (/org/freedesktop/NetworkManager/Devices/73)
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.152 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:05 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:05Z|00150|binding|INFO|Claiming lport 4548537f-6484-4703-a9a0-4975e2aa784b for this chassis.
Dec 11 01:10:05 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:05Z|00151|binding|INFO|4548537f-6484-4703-a9a0-4975e2aa784b: Claiming fa:16:3e:ed:3d:67 10.100.0.7
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.157 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.162 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.169 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:3d:67 10.100.0.7'], port_security=['fa:16:3e:ed:3d:67 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '182af6cf-b56e-4c6a-aeb5-092944f1745a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5914354e-3ed3-47fd-a912-9c7227988a8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b2a27f29-0456-4cbd-bd3d-dddfa9586d24', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a2caf243-c0ee-48b7-bf4e-5d1ee61c0e28, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=4548537f-6484-4703-a9a0-4975e2aa784b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.170 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 4548537f-6484-4703-a9a0-4975e2aa784b in datapath 5914354e-3ed3-47fd-a912-9c7227988a8d bound to our chassis#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.171 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5914354e-3ed3-47fd-a912-9c7227988a8d#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.186 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0f2acc5b-15b0-4c76-b008-d8abea53be30]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.187 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5914354e-31 in ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:10:05 np0005554845 systemd-udevd[217898]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.189 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5914354e-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.189 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ed8ae14c-50b6-4cdc-b812-9d4226380129]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.190 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[42544d11-be56-4ac4-af9d-61229498ac7b]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 NetworkManager[55529]: <info>  [1765433405.2016] device (tap4548537f-64): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:10:05 np0005554845 NetworkManager[55529]: <info>  [1765433405.2030] device (tap4548537f-64): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:10:05 np0005554845 systemd-machined[153381]: New machine qemu-10-instance-00000016.
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.203 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[55fc6d8e-0f3a-4e37-be05-b1c301227064]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.217 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:05 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:05Z|00152|binding|INFO|Setting lport 4548537f-6484-4703-a9a0-4975e2aa784b ovn-installed in OVS
Dec 11 01:10:05 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:05Z|00153|binding|INFO|Setting lport 4548537f-6484-4703-a9a0-4975e2aa784b up in Southbound
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.220 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[e903391b-3da5-4928-a43c-ddc51b26441f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 systemd[1]: Started Virtual Machine qemu-10-instance-00000016.
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.222 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.248 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[1e682677-d98b-4839-8516-4bd890f9f40f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.253 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b5b2fa03-76e4-4ced-8492-2c1a61ffc802]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 NetworkManager[55529]: <info>  [1765433405.2556] manager: (tap5914354e-30): new Veth device (/org/freedesktop/NetworkManager/Devices/74)
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.288 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[02d95989-b46e-41ba-a2ce-bc3317188a91]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.292 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[18d7d846-180d-4cc9-926d-80a5002d7209]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 NetworkManager[55529]: <info>  [1765433405.3156] device (tap5914354e-30): carrier: link connected
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.321 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[f8692362-1d67-42f0-9211-606ca51786b3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.340 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[9b140a0c-90a9-48d6-be55-246df48be97a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5914354e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:87:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 2, 'rx_bytes': 90, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367996, 'reachable_time': 31552, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 217931, 'error': None, 'target': 'ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.355 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[736298c1-106d-46c4-8bec-ecef26fbd385]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:878a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 367996, 'tstamp': 367996}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 217932, 'error': None, 'target': 'ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.374 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ddea40ff-55e1-42f0-91ee-896037ecc417]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5914354e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:87:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 2, 'tx_packets': 2, 'rx_bytes': 176, 'tx_bytes': 176, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 
0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 43], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367996, 'reachable_time': 31552, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 2, 'inoctets': 148, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 2, 'outoctets': 148, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 2, 'outmcastpkts': 2, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 148, 'outmcastoctets': 148, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 2, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 2, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 217933, 'error': None, 'target': 'ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.409 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3ee33807-b450-4af4-ab75-64f30a9256c4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.470 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c2d429eb-b516-4387-99a6-4bdfe360e5c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.474 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5914354e-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.474 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.475 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5914354e-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:05 np0005554845 NetworkManager[55529]: <info>  [1765433405.4778] manager: (tap5914354e-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/75)
Dec 11 01:10:05 np0005554845 kernel: tap5914354e-30: entered promiscuous mode
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.477 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.481 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5914354e-30, col_values=(('external_ids', {'iface-id': 'ad39ed30-f7d2-4c21-b6a9-3089fd6fd1a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:05 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:05Z|00154|binding|INFO|Releasing lport ad39ed30-f7d2-4c21-b6a9-3089fd6fd1a5 from this chassis (sb_readonly=0)
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.482 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.484 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5914354e-3ed3-47fd-a912-9c7227988a8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5914354e-3ed3-47fd-a912-9c7227988a8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.492 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f6a4fc48-737a-4de8-b2b8-ce780054a699]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.493 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.494 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-5914354e-3ed3-47fd-a912-9c7227988a8d
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/5914354e-3ed3-47fd-a912-9c7227988a8d.pid.haproxy
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 5914354e-3ed3-47fd-a912-9c7227988a8d
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:10:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:05.494 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d', 'env', 'PROCESS_TAG=haproxy-5914354e-3ed3-47fd-a912-9c7227988a8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5914354e-3ed3-47fd-a912-9c7227988a8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.596 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433405.5965538, 182af6cf-b56e-4c6a-aeb5-092944f1745a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.597 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] VM Started (Lifecycle Event)#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.616 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.619 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433405.5967252, 182af6cf-b56e-4c6a-aeb5-092944f1745a => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.619 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.641 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.644 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:10:05 np0005554845 nova_compute[187128]: 2025-12-11 06:10:05.662 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:10:05 np0005554845 podman[217971]: 2025-12-11 06:10:05.860430128 +0000 UTC m=+0.047600128 container create 0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:10:05 np0005554845 systemd[1]: Started libpod-conmon-0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49.scope.
Dec 11 01:10:05 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:10:05 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/951b7c7ae85ab273f3dec749dbf34d360fefc17400ee96ae1817442578031d1b/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:10:05 np0005554845 podman[217971]: 2025-12-11 06:10:05.835466358 +0000 UTC m=+0.022636368 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:10:05 np0005554845 podman[217971]: 2025-12-11 06:10:05.941066965 +0000 UTC m=+0.128237065 container init 0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 11 01:10:05 np0005554845 podman[217971]: 2025-12-11 06:10:05.947268733 +0000 UTC m=+0.134438763 container start 0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 11 01:10:05 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[217986]: [NOTICE]   (217990) : New worker (217992) forked
Dec 11 01:10:05 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[217986]: [NOTICE]   (217990) : Loading success.
Dec 11 01:10:08 np0005554845 nova_compute[187128]: 2025-12-11 06:10:08.307 187132 DEBUG nova.network.neutron [req-39e2a422-f9b1-4513-b292-9f628b85d507 req-f7376214-671c-45c0-aa2c-a6301429dca6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Updated VIF entry in instance network info cache for port 4548537f-6484-4703-a9a0-4975e2aa784b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:10:08 np0005554845 nova_compute[187128]: 2025-12-11 06:10:08.308 187132 DEBUG nova.network.neutron [req-39e2a422-f9b1-4513-b292-9f628b85d507 req-f7376214-671c-45c0-aa2c-a6301429dca6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Updating instance_info_cache with network_info: [{"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:10:08 np0005554845 nova_compute[187128]: 2025-12-11 06:10:08.341 187132 DEBUG oslo_concurrency.lockutils [req-39e2a422-f9b1-4513-b292-9f628b85d507 req-f7376214-671c-45c0-aa2c-a6301429dca6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:10:09 np0005554845 nova_compute[187128]: 2025-12-11 06:10:09.560 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:10 np0005554845 nova_compute[187128]: 2025-12-11 06:10:10.001 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:11 np0005554845 podman[218002]: 2025-12-11 06:10:11.126802449 +0000 UTC m=+0.063691835 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.409 187132 DEBUG nova.compute.manager [req-e1417711-cc65-471c-920b-8a4d3e91b26d req-19279099-4142-4678-958a-63b8b1c249cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.409 187132 DEBUG oslo_concurrency.lockutils [req-e1417711-cc65-471c-920b-8a4d3e91b26d req-19279099-4142-4678-958a-63b8b1c249cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.409 187132 DEBUG oslo_concurrency.lockutils [req-e1417711-cc65-471c-920b-8a4d3e91b26d req-19279099-4142-4678-958a-63b8b1c249cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.410 187132 DEBUG oslo_concurrency.lockutils [req-e1417711-cc65-471c-920b-8a4d3e91b26d req-19279099-4142-4678-958a-63b8b1c249cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.410 187132 DEBUG nova.compute.manager [req-e1417711-cc65-471c-920b-8a4d3e91b26d req-19279099-4142-4678-958a-63b8b1c249cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Processing event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.411 187132 DEBUG nova.compute.manager [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Instance event wait completed in 8 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.414 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433414.4139705, 182af6cf-b56e-4c6a-aeb5-092944f1745a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.414 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.416 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.419 187132 INFO nova.virt.libvirt.driver [-] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Instance spawned successfully.#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.419 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.447 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.451 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.454 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.454 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.454 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.455 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.455 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.456 187132 DEBUG nova.virt.libvirt.driver [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.491 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.525 187132 INFO nova.compute.manager [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Took 15.79 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.526 187132 DEBUG nova.compute.manager [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.563 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.604 187132 INFO nova.compute.manager [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Took 16.27 seconds to build instance.#033[00m
Dec 11 01:10:14 np0005554845 nova_compute[187128]: 2025-12-11 06:10:14.627 187132 DEBUG oslo_concurrency.lockutils [None req-3dbdfa1b-dcff-43bb-8b64-3d6d9f177855 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.370s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:15 np0005554845 nova_compute[187128]: 2025-12-11 06:10:15.002 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:16 np0005554845 nova_compute[187128]: 2025-12-11 06:10:16.718 187132 DEBUG nova.compute.manager [req-697fdf58-ebc5-494b-b26f-baffcd71ea95 req-cb0c90ef-71e8-41de-b712-85768fd5797d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:16 np0005554845 nova_compute[187128]: 2025-12-11 06:10:16.719 187132 DEBUG oslo_concurrency.lockutils [req-697fdf58-ebc5-494b-b26f-baffcd71ea95 req-cb0c90ef-71e8-41de-b712-85768fd5797d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:16 np0005554845 nova_compute[187128]: 2025-12-11 06:10:16.719 187132 DEBUG oslo_concurrency.lockutils [req-697fdf58-ebc5-494b-b26f-baffcd71ea95 req-cb0c90ef-71e8-41de-b712-85768fd5797d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:16 np0005554845 nova_compute[187128]: 2025-12-11 06:10:16.720 187132 DEBUG oslo_concurrency.lockutils [req-697fdf58-ebc5-494b-b26f-baffcd71ea95 req-cb0c90ef-71e8-41de-b712-85768fd5797d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:16 np0005554845 nova_compute[187128]: 2025-12-11 06:10:16.720 187132 DEBUG nova.compute.manager [req-697fdf58-ebc5-494b-b26f-baffcd71ea95 req-cb0c90ef-71e8-41de-b712-85768fd5797d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] No waiting events found dispatching network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:10:16 np0005554845 nova_compute[187128]: 2025-12-11 06:10:16.720 187132 WARNING nova.compute.manager [req-697fdf58-ebc5-494b-b26f-baffcd71ea95 req-cb0c90ef-71e8-41de-b712-85768fd5797d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received unexpected event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b for instance with vm_state active and task_state None.#033[00m
Dec 11 01:10:17 np0005554845 podman[218027]: 2025-12-11 06:10:17.135242192 +0000 UTC m=+0.066135773 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, tcib_managed=true)
Dec 11 01:10:17 np0005554845 NetworkManager[55529]: <info>  [1765433417.9663] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/76)
Dec 11 01:10:17 np0005554845 NetworkManager[55529]: <info>  [1765433417.9671] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/77)
Dec 11 01:10:17 np0005554845 nova_compute[187128]: 2025-12-11 06:10:17.966 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:10:18 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:18Z|00155|binding|INFO|Releasing lport ad39ed30-f7d2-4c21-b6a9-3089fd6fd1a5 from this chassis (sb_readonly=0)
Dec 11 01:10:18 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:18Z|00156|binding|INFO|Releasing lport ad39ed30-f7d2-4c21-b6a9-3089fd6fd1a5 from this chassis (sb_readonly=0)
Dec 11 01:10:18 np0005554845 nova_compute[187128]: 2025-12-11 06:10:18.123 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:10:18 np0005554845 nova_compute[187128]: 2025-12-11 06:10:18.941 187132 DEBUG nova.compute.manager [req-4399f4be-1f92-46df-92b7-1163a6434cf0 req-54abebd5-b54f-4c15-a98e-2b6b3800d20e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-changed-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:10:18 np0005554845 nova_compute[187128]: 2025-12-11 06:10:18.942 187132 DEBUG nova.compute.manager [req-4399f4be-1f92-46df-92b7-1163a6434cf0 req-54abebd5-b54f-4c15-a98e-2b6b3800d20e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Refreshing instance network info cache due to event network-changed-4548537f-6484-4703-a9a0-4975e2aa784b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 01:10:18 np0005554845 nova_compute[187128]: 2025-12-11 06:10:18.942 187132 DEBUG oslo_concurrency.lockutils [req-4399f4be-1f92-46df-92b7-1163a6434cf0 req-54abebd5-b54f-4c15-a98e-2b6b3800d20e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 01:10:18 np0005554845 nova_compute[187128]: 2025-12-11 06:10:18.942 187132 DEBUG oslo_concurrency.lockutils [req-4399f4be-1f92-46df-92b7-1163a6434cf0 req-54abebd5-b54f-4c15-a98e-2b6b3800d20e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 01:10:18 np0005554845 nova_compute[187128]: 2025-12-11 06:10:18.943 187132 DEBUG nova.network.neutron [req-4399f4be-1f92-46df-92b7-1163a6434cf0 req-54abebd5-b54f-4c15-a98e-2b6b3800d20e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Refreshing network info cache for port 4548537f-6484-4703-a9a0-4975e2aa784b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 11 01:10:19 np0005554845 nova_compute[187128]: 2025-12-11 06:10:19.565 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:10:20 np0005554845 nova_compute[187128]: 2025-12-11 06:10:20.003 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:10:20 np0005554845 podman[218049]: 2025-12-11 06:10:20.11571308 +0000 UTC m=+0.043730692 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 11 01:10:20 np0005554845 podman[218050]: 2025-12-11 06:10:20.15646618 +0000 UTC m=+0.080945885 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, tcib_managed=true, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 01:10:20 np0005554845 nova_compute[187128]: 2025-12-11 06:10:20.214 187132 DEBUG nova.network.neutron [req-4399f4be-1f92-46df-92b7-1163a6434cf0 req-54abebd5-b54f-4c15-a98e-2b6b3800d20e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Updated VIF entry in instance network info cache for port 4548537f-6484-4703-a9a0-4975e2aa784b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 11 01:10:20 np0005554845 nova_compute[187128]: 2025-12-11 06:10:20.214 187132 DEBUG nova.network.neutron [req-4399f4be-1f92-46df-92b7-1163a6434cf0 req-54abebd5-b54f-4c15-a98e-2b6b3800d20e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Updating instance_info_cache with network_info: [{"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 01:10:20 np0005554845 nova_compute[187128]: 2025-12-11 06:10:20.235 187132 DEBUG oslo_concurrency.lockutils [req-4399f4be-1f92-46df-92b7-1163a6434cf0 req-54abebd5-b54f-4c15-a98e-2b6b3800d20e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 01:10:22 np0005554845 podman[218094]: 2025-12-11 06:10:22.16841608 +0000 UTC m=+0.100628692 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.license=GPLv2)
Dec 11 01:10:23 np0005554845 nova_compute[187128]: 2025-12-11 06:10:23.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:10:23 np0005554845 nova_compute[187128]: 2025-12-11 06:10:23.714 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:10:23 np0005554845 nova_compute[187128]: 2025-12-11 06:10:23.715 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:10:23 np0005554845 nova_compute[187128]: 2025-12-11 06:10:23.715 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:10:23 np0005554845 nova_compute[187128]: 2025-12-11 06:10:23.715 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861
Dec 11 01:10:23 np0005554845 nova_compute[187128]: 2025-12-11 06:10:23.797 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:10:23 np0005554845 nova_compute[187128]: 2025-12-11 06:10:23.863 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:10:23 np0005554845 nova_compute[187128]: 2025-12-11 06:10:23.865 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:10:23 np0005554845 nova_compute[187128]: 2025-12-11 06:10:23.925 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.070 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.071 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5564MB free_disk=73.3297233581543GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.072 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.072 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.159 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance 182af6cf-b56e-4c6a-aeb5-092944f1745a actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.160 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.160 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.178 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing inventories for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.202 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating ProviderTree inventory for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.202 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.220 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing aggregate associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.250 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing trait associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, traits: COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.286 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.301 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.323 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.325 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.253s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:10:24 np0005554845 nova_compute[187128]: 2025-12-11 06:10:24.567 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:10:25 np0005554845 nova_compute[187128]: 2025-12-11 06:10:25.048 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:10:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:26.223 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:10:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:26.223 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:10:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:26.224 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:10:26 np0005554845 nova_compute[187128]: 2025-12-11 06:10:26.321 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:10:26 np0005554845 nova_compute[187128]: 2025-12-11 06:10:26.322 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:10:26 np0005554845 nova_compute[187128]: 2025-12-11 06:10:26.322 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 11 01:10:26 np0005554845 nova_compute[187128]: 2025-12-11 06:10:26.322 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 11 01:10:27 np0005554845 podman[218134]: 2025-12-11 06:10:27.139196221 +0000 UTC m=+0.057580230 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:10:27 np0005554845 podman[218135]: 2025-12-11 06:10:27.180976308 +0000 UTC m=+0.093375223 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, release=1755695350, io.openshift.expose-services=, vcs-type=git, architecture=x86_64, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 11 01:10:27 np0005554845 nova_compute[187128]: 2025-12-11 06:10:27.192 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:10:27 np0005554845 nova_compute[187128]: 2025-12-11 06:10:27.193 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:10:27 np0005554845 nova_compute[187128]: 2025-12-11 06:10:27.193 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 01:10:27 np0005554845 nova_compute[187128]: 2025-12-11 06:10:27.193 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid 182af6cf-b56e-4c6a-aeb5-092944f1745a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:10:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:27Z|00019|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:ed:3d:67 10.100.0.7
Dec 11 01:10:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:27Z|00020|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ed:3d:67 10.100.0.7
Dec 11 01:10:29 np0005554845 nova_compute[187128]: 2025-12-11 06:10:29.571 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:30 np0005554845 nova_compute[187128]: 2025-12-11 06:10:30.050 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:30 np0005554845 nova_compute[187128]: 2025-12-11 06:10:30.326 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Updating instance_info_cache with network_info: [{"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:10:30 np0005554845 nova_compute[187128]: 2025-12-11 06:10:30.343 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:10:30 np0005554845 nova_compute[187128]: 2025-12-11 06:10:30.343 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:10:30 np0005554845 nova_compute[187128]: 2025-12-11 06:10:30.344 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:10:30 np0005554845 nova_compute[187128]: 2025-12-11 06:10:30.344 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:10:30 np0005554845 nova_compute[187128]: 2025-12-11 06:10:30.344 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:10:30 np0005554845 nova_compute[187128]: 2025-12-11 06:10:30.345 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:10:30 np0005554845 nova_compute[187128]: 2025-12-11 06:10:30.345 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:10:30 np0005554845 nova_compute[187128]: 2025-12-11 06:10:30.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:10:30 np0005554845 nova_compute[187128]: 2025-12-11 06:10:30.720 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:10:31 np0005554845 nova_compute[187128]: 2025-12-11 06:10:31.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:10:34 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:34.486 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=9, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=8) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:10:34 np0005554845 nova_compute[187128]: 2025-12-11 06:10:34.486 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:34 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:34.487 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:10:34 np0005554845 nova_compute[187128]: 2025-12-11 06:10:34.573 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:35 np0005554845 nova_compute[187128]: 2025-12-11 06:10:35.142 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:35 np0005554845 nova_compute[187128]: 2025-12-11 06:10:35.817 187132 INFO nova.compute.manager [None req-2dc040d4-d669-47d1-b5dd-2ce9200919fd 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Get console output#033[00m
Dec 11 01:10:35 np0005554845 nova_compute[187128]: 2025-12-11 06:10:35.822 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:10:36 np0005554845 nova_compute[187128]: 2025-12-11 06:10:36.102 187132 DEBUG oslo_concurrency.lockutils [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:36 np0005554845 nova_compute[187128]: 2025-12-11 06:10:36.103 187132 DEBUG oslo_concurrency.lockutils [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a" acquired by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:36 np0005554845 nova_compute[187128]: 2025-12-11 06:10:36.103 187132 INFO nova.compute.manager [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Rebooting instance#033[00m
Dec 11 01:10:36 np0005554845 nova_compute[187128]: 2025-12-11 06:10:36.126 187132 DEBUG oslo_concurrency.lockutils [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:10:36 np0005554845 nova_compute[187128]: 2025-12-11 06:10:36.127 187132 DEBUG oslo_concurrency.lockutils [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquired lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:10:36 np0005554845 nova_compute[187128]: 2025-12-11 06:10:36.127 187132 DEBUG nova.network.neutron [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:10:39 np0005554845 nova_compute[187128]: 2025-12-11 06:10:39.575 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:40 np0005554845 nova_compute[187128]: 2025-12-11 06:10:40.145 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:41 np0005554845 nova_compute[187128]: 2025-12-11 06:10:41.003 187132 DEBUG nova.network.neutron [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Updating instance_info_cache with network_info: [{"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:10:41 np0005554845 nova_compute[187128]: 2025-12-11 06:10:41.019 187132 DEBUG oslo_concurrency.lockutils [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Releasing lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:10:41 np0005554845 nova_compute[187128]: 2025-12-11 06:10:41.021 187132 DEBUG nova.compute.manager [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:42 np0005554845 podman[218181]: 2025-12-11 06:10:42.156014684 +0000 UTC m=+0.076054633 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:10:42 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:42.490 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '9'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:42 np0005554845 nova_compute[187128]: 2025-12-11 06:10:42.906 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:42 np0005554845 nova_compute[187128]: 2025-12-11 06:10:42.907 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:42 np0005554845 nova_compute[187128]: 2025-12-11 06:10:42.941 187132 DEBUG nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.068 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.069 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.075 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.076 187132 INFO nova.compute.claims [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.208 187132 DEBUG nova.compute.provider_tree [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.225 187132 DEBUG nova.scheduler.client.report [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.263 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.194s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.264 187132 DEBUG nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.317 187132 DEBUG nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.318 187132 DEBUG nova.network.neutron [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.348 187132 INFO nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:10:43 np0005554845 kernel: tap4548537f-64 (unregistering): left promiscuous mode
Dec 11 01:10:43 np0005554845 NetworkManager[55529]: <info>  [1765433443.3704] device (tap4548537f-64): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.372 187132 DEBUG nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.376 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:43 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:43Z|00157|binding|INFO|Releasing lport 4548537f-6484-4703-a9a0-4975e2aa784b from this chassis (sb_readonly=0)
Dec 11 01:10:43 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:43Z|00158|binding|INFO|Setting lport 4548537f-6484-4703-a9a0-4975e2aa784b down in Southbound
Dec 11 01:10:43 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:43Z|00159|binding|INFO|Removing iface tap4548537f-64 ovn-installed in OVS
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.379 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.384 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:3d:67 10.100.0.7'], port_security=['fa:16:3e:ed:3d:67 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '182af6cf-b56e-4c6a-aeb5-092944f1745a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5914354e-3ed3-47fd-a912-9c7227988a8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b2a27f29-0456-4cbd-bd3d-dddfa9586d24', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.201'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a2caf243-c0ee-48b7-bf4e-5d1ee61c0e28, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=4548537f-6484-4703-a9a0-4975e2aa784b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.385 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 4548537f-6484-4703-a9a0-4975e2aa784b in datapath 5914354e-3ed3-47fd-a912-9c7227988a8d unbound from our chassis#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.387 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5914354e-3ed3-47fd-a912-9c7227988a8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.387 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[71fc9147-f8f2-44f9-8b05-f8f39a83a6e2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.388 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d namespace which is not needed anymore#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.392 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:43 np0005554845 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000016.scope: Deactivated successfully.
Dec 11 01:10:43 np0005554845 systemd[1]: machine-qemu\x2d10\x2dinstance\x2d00000016.scope: Consumed 13.497s CPU time.
Dec 11 01:10:43 np0005554845 systemd-machined[153381]: Machine qemu-10-instance-00000016 terminated.
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.468 187132 DEBUG nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.469 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.470 187132 INFO nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Creating image(s)#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.471 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "/var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.471 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.472 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.490 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:10:43 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[217986]: [NOTICE]   (217990) : haproxy version is 2.8.14-c23fe91
Dec 11 01:10:43 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[217986]: [NOTICE]   (217990) : path to executable is /usr/sbin/haproxy
Dec 11 01:10:43 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[217986]: [WARNING]  (217990) : Exiting Master process...
Dec 11 01:10:43 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[217986]: [ALERT]    (217990) : Current worker (217992) exited with code 143 (Terminated)
Dec 11 01:10:43 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[217986]: [WARNING]  (217990) : All workers exited. Exiting... (0)
Dec 11 01:10:43 np0005554845 systemd[1]: libpod-0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49.scope: Deactivated successfully.
Dec 11 01:10:43 np0005554845 podman[218229]: 2025-12-11 06:10:43.520887269 +0000 UTC m=+0.049298315 container died 0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Dec 11 01:10:43 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49-userdata-shm.mount: Deactivated successfully.
Dec 11 01:10:43 np0005554845 systemd[1]: var-lib-containers-storage-overlay-951b7c7ae85ab273f3dec749dbf34d360fefc17400ee96ae1817442578031d1b-merged.mount: Deactivated successfully.
Dec 11 01:10:43 np0005554845 podman[218229]: 2025-12-11 06:10:43.551498542 +0000 UTC m=+0.079909588 container cleanup 0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.560 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.561 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.562 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:43 np0005554845 systemd[1]: libpod-conmon-0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49.scope: Deactivated successfully.
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.574 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:10:43 np0005554845 podman[218259]: 2025-12-11 06:10:43.607675823 +0000 UTC m=+0.039405255 container remove 0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.611 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ae1b3a48-3047-4be7-9286-e041185ae078]: (4, ('Thu Dec 11 06:10:43 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d (0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49)\n0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49\nThu Dec 11 06:10:43 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d (0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49)\n0e6867f951fc3f6aff8137805001f00be310476fbe4b5ea896b88ff3de01ae49\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.613 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[9a9d3a7c-06d2-47f2-856b-e81bea509d54]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.614 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5914354e-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:43 np0005554845 kernel: tap5914354e-30: left promiscuous mode
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.615 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.630 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.631 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.633 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a8b1474f-1d19-41f0-acaa-a15d0f692d47]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.647 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.648 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[edce1518-2db2-45b9-9d0f-11b3fc9be8f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.649 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[17706b2a-e968-4784-91e5-e6573bb0b4ca]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.664 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[70544c16-6ea2-474b-9520-2c1f4613ff80]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 367988, 'reachable_time': 34659, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218301, 'error': None, 'target': 'ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.666 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:10:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:43.666 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[8ebc502c-26f6-40d0-ad57-380ac5e61485]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:43 np0005554845 systemd[1]: run-netns-ovnmeta\x2d5914354e\x2d3ed3\x2d47fd\x2da912\x2d9c7227988a8d.mount: Deactivated successfully.
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.674 187132 DEBUG nova.policy [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.722 187132 DEBUG nova.compute.manager [req-840ad5b5-61b0-4caa-b874-6e9b14df64c0 req-9cc951f1-3b51-4d86-9613-7107d76f71a3 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-vif-unplugged-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.722 187132 DEBUG oslo_concurrency.lockutils [req-840ad5b5-61b0-4caa-b874-6e9b14df64c0 req-9cc951f1-3b51-4d86-9613-7107d76f71a3 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.722 187132 DEBUG oslo_concurrency.lockutils [req-840ad5b5-61b0-4caa-b874-6e9b14df64c0 req-9cc951f1-3b51-4d86-9613-7107d76f71a3 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.723 187132 DEBUG oslo_concurrency.lockutils [req-840ad5b5-61b0-4caa-b874-6e9b14df64c0 req-9cc951f1-3b51-4d86-9613-7107d76f71a3 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.723 187132 DEBUG nova.compute.manager [req-840ad5b5-61b0-4caa-b874-6e9b14df64c0 req-9cc951f1-3b51-4d86-9613-7107d76f71a3 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] No waiting events found dispatching network-vif-unplugged-4548537f-6484-4703-a9a0-4975e2aa784b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.723 187132 WARNING nova.compute.manager [req-840ad5b5-61b0-4caa-b874-6e9b14df64c0 req-9cc951f1-3b51-4d86-9613-7107d76f71a3 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received unexpected event network-vif-unplugged-4548537f-6484-4703-a9a0-4975e2aa784b for instance with vm_state active and task_state reboot_started.#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.933 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk 1073741824" returned: 0 in 0.302s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.933 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.372s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.934 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.997 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.998 187132 DEBUG nova.virt.disk.api [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Checking if we can resize image /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:10:43 np0005554845 nova_compute[187128]: 2025-12-11 06:10:43.999 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.088 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk --force-share --output=json" returned: 0 in 0.089s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.089 187132 DEBUG nova.virt.disk.api [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Cannot resize image /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.090 187132 DEBUG nova.objects.instance [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 12be6919-2546-4cdc-9e86-d73c99aaad0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.117 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.118 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Ensure instance console log exists: /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.118 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.119 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.119 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.151 187132 INFO nova.virt.libvirt.driver [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Instance shutdown successfully.#033[00m
Dec 11 01:10:44 np0005554845 kernel: tap4548537f-64: entered promiscuous mode
Dec 11 01:10:44 np0005554845 NetworkManager[55529]: <info>  [1765433444.2151] manager: (tap4548537f-64): new Tun device (/org/freedesktop/NetworkManager/Devices/78)
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.214 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:44 np0005554845 systemd-udevd[218210]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:10:44 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:44Z|00160|binding|INFO|Claiming lport 4548537f-6484-4703-a9a0-4975e2aa784b for this chassis.
Dec 11 01:10:44 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:44Z|00161|binding|INFO|4548537f-6484-4703-a9a0-4975e2aa784b: Claiming fa:16:3e:ed:3d:67 10.100.0.7
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.222 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:3d:67 10.100.0.7'], port_security=['fa:16:3e:ed:3d:67 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '182af6cf-b56e-4c6a-aeb5-092944f1745a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5914354e-3ed3-47fd-a912-9c7227988a8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b2a27f29-0456-4cbd-bd3d-dddfa9586d24', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.201'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a2caf243-c0ee-48b7-bf4e-5d1ee61c0e28, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=4548537f-6484-4703-a9a0-4975e2aa784b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.224 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 4548537f-6484-4703-a9a0-4975e2aa784b in datapath 5914354e-3ed3-47fd-a912-9c7227988a8d bound to our chassis#033[00m
Dec 11 01:10:44 np0005554845 NetworkManager[55529]: <info>  [1765433444.2278] device (tap4548537f-64): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.227 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5914354e-3ed3-47fd-a912-9c7227988a8d#033[00m
Dec 11 01:10:44 np0005554845 NetworkManager[55529]: <info>  [1765433444.2292] device (tap4548537f-64): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:10:44 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:44Z|00162|binding|INFO|Setting lport 4548537f-6484-4703-a9a0-4975e2aa784b ovn-installed in OVS
Dec 11 01:10:44 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:44Z|00163|binding|INFO|Setting lport 4548537f-6484-4703-a9a0-4975e2aa784b up in Southbound
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.235 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.241 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f086c381-f46a-40e6-8c2e-2f290e4df668]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.241 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5914354e-31 in ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.243 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5914354e-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.244 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ed9d3722-206b-4a69-b893-9e42d238370e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.244 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6ffa54db-c0a4-4d94-9a3d-91993e3a609a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 systemd-machined[153381]: New machine qemu-11-instance-00000016.
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.257 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[c64e1e35-a0f5-4b9a-8fbb-e269df91a4e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 systemd[1]: Started Virtual Machine qemu-11-instance-00000016.
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.282 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[eaff8176-951d-4fb0-8faf-16ae4a4b4ba6]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.311 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e2ff96-b4d6-4743-9a68-62f83175c7d8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 NetworkManager[55529]: <info>  [1765433444.3185] manager: (tap5914354e-30): new Veth device (/org/freedesktop/NetworkManager/Devices/79)
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.319 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[884b83de-7e81-4c1b-942a-864fb39d2293]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.353 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[fe80f9c8-a061-41cc-bb69-095ecb9f00a2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.357 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[283d1355-7c48-4494-871d-3b239c36553b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 NetworkManager[55529]: <info>  [1765433444.3793] device (tap5914354e-30): carrier: link connected
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.387 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[81c702a0-c9b5-45a9-be2d-ea5a27dca420]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.408 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2cf70042-89b1-41dd-89b6-650d2ac165bb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5914354e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:87:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371902, 'reachable_time': 18435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218354, 'error': None, 'target': 'ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.429 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1b68960a-9450-4f96-9b71-316bbd3fec9e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe25:878a'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 371902, 'tstamp': 371902}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218355, 'error': None, 'target': 'ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.449 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a1c71b84-619e-4d79-a9be-8adab3670e3d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5914354e-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:25:87:8a'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 46], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371902, 'reachable_time': 18435, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218356, 'error': None, 'target': 'ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.492 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1193b7e1-eb4f-4203-880b-cbd89a98f7e3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.577 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.581 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5d921aef-8885-43fd-b24c-0da91a2c91a9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.583 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5914354e-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.583 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.584 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5914354e-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:44 np0005554845 NetworkManager[55529]: <info>  [1765433444.5873] manager: (tap5914354e-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/80)
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.586 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:44 np0005554845 kernel: tap5914354e-30: entered promiscuous mode
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.591 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.592 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5914354e-30, col_values=(('external_ids', {'iface-id': 'ad39ed30-f7d2-4c21-b6a9-3089fd6fd1a5'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:44 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:44Z|00164|binding|INFO|Releasing lport ad39ed30-f7d2-4c21-b6a9-3089fd6fd1a5 from this chassis (sb_readonly=0)
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.593 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.616 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.619 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.619 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5914354e-3ed3-47fd-a912-9c7227988a8d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5914354e-3ed3-47fd-a912-9c7227988a8d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.621 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c59d1572-2e53-44e8-9196-895f2257c26a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.622 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-5914354e-3ed3-47fd-a912-9c7227988a8d
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/5914354e-3ed3-47fd-a912-9c7227988a8d.pid.haproxy
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 5914354e-3ed3-47fd-a912-9c7227988a8d
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:10:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:44.623 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d', 'env', 'PROCESS_TAG=haproxy-5914354e-3ed3-47fd-a912-9c7227988a8d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5914354e-3ed3-47fd-a912-9c7227988a8d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.712 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Removed pending event for 182af6cf-b56e-4c6a-aeb5-092944f1745a due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.713 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433444.7120628, 182af6cf-b56e-4c6a-aeb5-092944f1745a => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.714 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.721 187132 INFO nova.virt.libvirt.driver [-] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Instance running successfully.#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.722 187132 INFO nova.virt.libvirt.driver [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Instance soft rebooted successfully.#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.723 187132 DEBUG nova.compute.manager [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.749 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.755 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: reboot_started, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.794 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] During sync_power_state the instance has a pending task (reboot_started). Skip.#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.794 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433444.7135358, 182af6cf-b56e-4c6a-aeb5-092944f1745a => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.795 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] VM Started (Lifecycle Event)#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.810 187132 DEBUG oslo_concurrency.lockutils [None req-4c546848-9b9e-45f9-9dfb-cfe392b5908e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a" "released" by "nova.compute.manager.ComputeManager.reboot_instance.<locals>.do_reboot_instance" :: held 8.707s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.833 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:44 np0005554845 nova_compute[187128]: 2025-12-11 06:10:44.838 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: None, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:10:45 np0005554845 podman[218395]: 2025-12-11 06:10:45.02328694 +0000 UTC m=+0.056397478 container create 741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 11 01:10:45 np0005554845 systemd[1]: Started libpod-conmon-741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be.scope.
Dec 11 01:10:45 np0005554845 podman[218395]: 2025-12-11 06:10:44.990566258 +0000 UTC m=+0.023676816 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:10:45 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:10:45 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bf707fef1c79443ffb661b819a4aea9cb0fd5754d6dd4abc4e1d22d38ca81ab7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:10:45 np0005554845 podman[218395]: 2025-12-11 06:10:45.118304797 +0000 UTC m=+0.151415335 container init 741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 01:10:45 np0005554845 podman[218395]: 2025-12-11 06:10:45.126247163 +0000 UTC m=+0.159357701 container start 741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS)
Dec 11 01:10:45 np0005554845 nova_compute[187128]: 2025-12-11 06:10:45.147 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:45 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[218410]: [NOTICE]   (218414) : New worker (218416) forked
Dec 11 01:10:45 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[218410]: [NOTICE]   (218414) : Loading success.
Dec 11 01:10:45 np0005554845 nova_compute[187128]: 2025-12-11 06:10:45.457 187132 DEBUG nova.network.neutron [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Successfully created port: 0c0854bd-fd12-4869-8d7f-57d59abbb6ee _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:10:46 np0005554845 nova_compute[187128]: 2025-12-11 06:10:46.658 187132 DEBUG nova.network.neutron [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Successfully created port: 7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:10:48 np0005554845 podman[218426]: 2025-12-11 06:10:48.134076977 +0000 UTC m=+0.072802994 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, 
tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.205 187132 DEBUG nova.compute.manager [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.206 187132 DEBUG oslo_concurrency.lockutils [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.207 187132 DEBUG oslo_concurrency.lockutils [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.207 187132 DEBUG oslo_concurrency.lockutils [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.207 187132 DEBUG nova.compute.manager [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] No waiting events found dispatching network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.207 187132 WARNING nova.compute.manager [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received unexpected event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b for instance with vm_state active and task_state None.#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.207 187132 DEBUG nova.compute.manager [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.207 187132 DEBUG oslo_concurrency.lockutils [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.208 187132 DEBUG oslo_concurrency.lockutils [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.208 187132 DEBUG oslo_concurrency.lockutils [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.208 187132 DEBUG nova.compute.manager [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] No waiting events found dispatching network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.208 187132 WARNING nova.compute.manager [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received unexpected event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b for instance with vm_state active and task_state None.#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.208 187132 DEBUG nova.compute.manager [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.208 187132 DEBUG oslo_concurrency.lockutils [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.209 187132 DEBUG oslo_concurrency.lockutils [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.209 187132 DEBUG oslo_concurrency.lockutils [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.209 187132 DEBUG nova.compute.manager [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] No waiting events found dispatching network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.209 187132 WARNING nova.compute.manager [req-60dadcf1-692d-4b60-9132-9885113254fc req-ef28c030-8523-4110-8089-69e48d43a54b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received unexpected event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b for instance with vm_state active and task_state None.#033[00m
Dec 11 01:10:48 np0005554845 nova_compute[187128]: 2025-12-11 06:10:48.466 187132 DEBUG nova.network.neutron [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Successfully updated port: 0c0854bd-fd12-4869-8d7f-57d59abbb6ee _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:10:49 np0005554845 nova_compute[187128]: 2025-12-11 06:10:49.580 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.179 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.293 187132 DEBUG nova.compute.manager [req-f2d2ee77-200d-4fef-9ba5-41b2f1bcc648 req-c03311c9-8b53-4d13-9dcd-a634b21d37ca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-changed-0c0854bd-fd12-4869-8d7f-57d59abbb6ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.294 187132 DEBUG nova.compute.manager [req-f2d2ee77-200d-4fef-9ba5-41b2f1bcc648 req-c03311c9-8b53-4d13-9dcd-a634b21d37ca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Refreshing instance network info cache due to event network-changed-0c0854bd-fd12-4869-8d7f-57d59abbb6ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.295 187132 DEBUG oslo_concurrency.lockutils [req-f2d2ee77-200d-4fef-9ba5-41b2f1bcc648 req-c03311c9-8b53-4d13-9dcd-a634b21d37ca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.295 187132 DEBUG oslo_concurrency.lockutils [req-f2d2ee77-200d-4fef-9ba5-41b2f1bcc648 req-c03311c9-8b53-4d13-9dcd-a634b21d37ca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.296 187132 DEBUG nova.network.neutron [req-f2d2ee77-200d-4fef-9ba5-41b2f1bcc648 req-c03311c9-8b53-4d13-9dcd-a634b21d37ca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Refreshing network info cache for port 0c0854bd-fd12-4869-8d7f-57d59abbb6ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.547 187132 DEBUG nova.network.neutron [req-f2d2ee77-200d-4fef-9ba5-41b2f1bcc648 req-c03311c9-8b53-4d13-9dcd-a634b21d37ca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.577 187132 DEBUG nova.network.neutron [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Successfully updated port: 7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.602 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.957 187132 DEBUG nova.network.neutron [req-f2d2ee77-200d-4fef-9ba5-41b2f1bcc648 req-c03311c9-8b53-4d13-9dcd-a634b21d37ca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.978 187132 DEBUG oslo_concurrency.lockutils [req-f2d2ee77-200d-4fef-9ba5-41b2f1bcc648 req-c03311c9-8b53-4d13-9dcd-a634b21d37ca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.979 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquired lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:10:50 np0005554845 nova_compute[187128]: 2025-12-11 06:10:50.979 187132 DEBUG nova.network.neutron [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:10:51 np0005554845 podman[218447]: 2025-12-11 06:10:51.146342663 +0000 UTC m=+0.080165245 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:10:51 np0005554845 podman[218448]: 2025-12-11 06:10:51.187941856 +0000 UTC m=+0.108468995 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:10:51 np0005554845 nova_compute[187128]: 2025-12-11 06:10:51.427 187132 DEBUG nova.network.neutron [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:10:52 np0005554845 nova_compute[187128]: 2025-12-11 06:10:52.374 187132 DEBUG nova.compute.manager [req-387edf91-61ed-4723-917e-22b505c25a66 req-1e227762-821a-422d-90a1-11b01096bac7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-changed-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:52 np0005554845 nova_compute[187128]: 2025-12-11 06:10:52.375 187132 DEBUG nova.compute.manager [req-387edf91-61ed-4723-917e-22b505c25a66 req-1e227762-821a-422d-90a1-11b01096bac7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Refreshing instance network info cache due to event network-changed-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:10:52 np0005554845 nova_compute[187128]: 2025-12-11 06:10:52.376 187132 DEBUG oslo_concurrency.lockutils [req-387edf91-61ed-4723-917e-22b505c25a66 req-1e227762-821a-422d-90a1-11b01096bac7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:10:53 np0005554845 podman[218492]: 2025-12-11 06:10:53.162299762 +0000 UTC m=+0.097741013 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202)
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.333 187132 DEBUG nova.network.neutron [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Updating instance_info_cache with network_info: [{"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", 
"type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.583 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.696 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Releasing lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.697 187132 DEBUG nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Instance network_info: |[{"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.698 187132 DEBUG oslo_concurrency.lockutils [req-387edf91-61ed-4723-917e-22b505c25a66 req-1e227762-821a-422d-90a1-11b01096bac7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.699 187132 DEBUG nova.network.neutron [req-387edf91-61ed-4723-917e-22b505c25a66 req-1e227762-821a-422d-90a1-11b01096bac7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Refreshing network info cache for port 7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.706 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Start _get_guest_xml network_info=[{"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.713 187132 WARNING nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.724 187132 DEBUG nova.virt.libvirt.host [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.725 187132 DEBUG nova.virt.libvirt.host [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.728 187132 DEBUG nova.virt.libvirt.host [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.729 187132 DEBUG nova.virt.libvirt.host [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.731 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.731 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.732 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.732 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.733 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.733 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.734 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.734 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.735 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.735 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.736 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.736 187132 DEBUG nova.virt.hardware [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.742 187132 DEBUG nova.virt.libvirt.vif [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1058948979',display_name='tempest-TestGettingAddress-server-1058948979',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1058948979',id=24,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKvxcArbgQcBaVlQhT0YV9SpArEMuTTq9c3yQWkMSmfKLY7Z8WYASOT/Zu1BhCks0QrbKz/edsZHSdUKDdywlzIOsMV6p/y8piZ3LJ97vkO4FbcfalR52ueB56xrs3+/jw==',key_name='tempest-TestGettingAddress-80412355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-wh8dxzqk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:10:43Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=12be6919-2546-4cdc-9e86-d73c99aaad0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.743 187132 DEBUG nova.network.os_vif_util [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.744 187132 DEBUG nova.network.os_vif_util [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:da:05,bridge_name='br-int',has_traffic_filtering=True,id=0c0854bd-fd12-4869-8d7f-57d59abbb6ee,network=Network(8c525079-021c-44d0-899c-c53f6754298b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c0854bd-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.746 187132 DEBUG nova.virt.libvirt.vif [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1058948979',display_name='tempest-TestGettingAddress-server-1058948979',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1058948979',id=24,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKvxcArbgQcBaVlQhT0YV9SpArEMuTTq9c3yQWkMSmfKLY7Z8WYASOT/Zu1BhCks0QrbKz/edsZHSdUKDdywlzIOsMV6p/y8piZ3LJ97vkO4FbcfalR52ueB56xrs3+/jw==',key_name='tempest-TestGettingAddress-80412355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-wh8dxzqk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:10:43Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=12be6919-2546-4cdc-9e86-d73c99aaad0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.747 187132 DEBUG nova.network.os_vif_util [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.748 187132 DEBUG nova.network.os_vif_util [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:e3:37,bridge_name='br-int',has_traffic_filtering=True,id=7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b,network=Network(1e539a2e-efc5-4d88-a649-84787d0021ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ef553a1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.750 187132 DEBUG nova.objects.instance [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 12be6919-2546-4cdc-9e86-d73c99aaad0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.912 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  <uuid>12be6919-2546-4cdc-9e86-d73c99aaad0c</uuid>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  <name>instance-00000018</name>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestGettingAddress-server-1058948979</nova:name>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:10:54</nova:creationTime>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:        <nova:user uuid="60e9372de4754580913a836e11b9c248">tempest-TestGettingAddress-725523770-project-member</nova:user>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:        <nova:project uuid="79a211a6fc3c4f68b6c3d0ba433964d3">tempest-TestGettingAddress-725523770</nova:project>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:        <nova:port uuid="0c0854bd-fd12-4869-8d7f-57d59abbb6ee">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:        <nova:port uuid="7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fecd:e337" ipVersion="6"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fecd:e337" ipVersion="6"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <entry name="serial">12be6919-2546-4cdc-9e86-d73c99aaad0c</entry>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <entry name="uuid">12be6919-2546-4cdc-9e86-d73c99aaad0c</entry>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk.config"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:c3:da:05"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <target dev="tap0c0854bd-fd"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:cd:e3:37"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <target dev="tap7ef553a1-7d"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/console.log" append="off"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:10:54 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:10:54 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:10:54 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:10:54 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.914 187132 DEBUG nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Preparing to wait for external event network-vif-plugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.915 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.915 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.916 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.916 187132 DEBUG nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Preparing to wait for external event network-vif-plugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.917 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.917 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.917 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.919 187132 DEBUG nova.virt.libvirt.vif [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1058948979',display_name='tempest-TestGettingAddress-server-1058948979',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1058948979',id=24,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKvxcArbgQcBaVlQhT0YV9SpArEMuTTq9c3yQWkMSmfKLY7Z8WYASOT/Zu1BhCks0QrbKz/edsZHSdUKDdywlzIOsMV6p/y8piZ3LJ97vkO4FbcfalR52ueB56xrs3+/jw==',key_name='tempest-TestGettingAddress-80412355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-wh8dxzqk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:10:43Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=12be6919-2546-4cdc-9e86-d73c99aaad0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.919 187132 DEBUG nova.network.os_vif_util [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.920 187132 DEBUG nova.network.os_vif_util [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c3:da:05,bridge_name='br-int',has_traffic_filtering=True,id=0c0854bd-fd12-4869-8d7f-57d59abbb6ee,network=Network(8c525079-021c-44d0-899c-c53f6754298b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c0854bd-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.921 187132 DEBUG os_vif [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:da:05,bridge_name='br-int',has_traffic_filtering=True,id=0c0854bd-fd12-4869-8d7f-57d59abbb6ee,network=Network(8c525079-021c-44d0-899c-c53f6754298b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c0854bd-fd') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.922 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.923 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.923 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.927 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.927 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0c0854bd-fd, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.928 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0c0854bd-fd, col_values=(('external_ids', {'iface-id': '0c0854bd-fd12-4869-8d7f-57d59abbb6ee', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c3:da:05', 'vm-uuid': '12be6919-2546-4cdc-9e86-d73c99aaad0c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.971 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:54 np0005554845 NetworkManager[55529]: <info>  [1765433454.9723] manager: (tap0c0854bd-fd): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/81)
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.974 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.982 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.983 187132 INFO os_vif [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c3:da:05,bridge_name='br-int',has_traffic_filtering=True,id=0c0854bd-fd12-4869-8d7f-57d59abbb6ee,network=Network(8c525079-021c-44d0-899c-c53f6754298b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c0854bd-fd')#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.984 187132 DEBUG nova.virt.libvirt.vif [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1058948979',display_name='tempest-TestGettingAddress-server-1058948979',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1058948979',id=24,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKvxcArbgQcBaVlQhT0YV9SpArEMuTTq9c3yQWkMSmfKLY7Z8WYASOT/Zu1BhCks0QrbKz/edsZHSdUKDdywlzIOsMV6p/y8piZ3LJ97vkO4FbcfalR52ueB56xrs3+/jw==',key_name='tempest-TestGettingAddress-80412355',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-wh8dxzqk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:10:43Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=12be6919-2546-4cdc-9e86-d73c99aaad0c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.985 187132 DEBUG nova.network.os_vif_util [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.986 187132 DEBUG nova.network.os_vif_util [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:e3:37,bridge_name='br-int',has_traffic_filtering=True,id=7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b,network=Network(1e539a2e-efc5-4d88-a649-84787d0021ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ef553a1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.987 187132 DEBUG os_vif [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:e3:37,bridge_name='br-int',has_traffic_filtering=True,id=7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b,network=Network(1e539a2e-efc5-4d88-a649-84787d0021ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ef553a1-7d') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.988 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.988 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.989 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.992 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.993 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7ef553a1-7d, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.993 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7ef553a1-7d, col_values=(('external_ids', {'iface-id': '7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:cd:e3:37', 'vm-uuid': '12be6919-2546-4cdc-9e86-d73c99aaad0c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.995 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:54 np0005554845 NetworkManager[55529]: <info>  [1765433454.9968] manager: (tap7ef553a1-7d): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/82)
Dec 11 01:10:54 np0005554845 nova_compute[187128]: 2025-12-11 06:10:54.998 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.007 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.008 187132 INFO os_vif [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:e3:37,bridge_name='br-int',has_traffic_filtering=True,id=7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b,network=Network(1e539a2e-efc5-4d88-a649-84787d0021ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ef553a1-7d')#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.110 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.111 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.111 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No VIF found with MAC fa:16:3e:c3:da:05, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.112 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No VIF found with MAC fa:16:3e:cd:e3:37, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.112 187132 INFO nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Using config drive#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.181 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.542 187132 INFO nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Creating config drive at /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk.config#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.546 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphub0zma8 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.674 187132 DEBUG oslo_concurrency.processutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphub0zma8" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:10:55 np0005554845 kernel: tap0c0854bd-fd: entered promiscuous mode
Dec 11 01:10:55 np0005554845 NetworkManager[55529]: <info>  [1765433455.7515] manager: (tap0c0854bd-fd): new Tun device (/org/freedesktop/NetworkManager/Devices/83)
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.755 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:55 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:55Z|00165|binding|INFO|Claiming lport 0c0854bd-fd12-4869-8d7f-57d59abbb6ee for this chassis.
Dec 11 01:10:55 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:55Z|00166|binding|INFO|0c0854bd-fd12-4869-8d7f-57d59abbb6ee: Claiming fa:16:3e:c3:da:05 10.100.0.3
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.768 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:da:05 10.100.0.3'], port_security=['fa:16:3e:c3:da:05 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '12be6919-2546-4cdc-9e86-d73c99aaad0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c525079-021c-44d0-899c-c53f6754298b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1eaefc9c-20e5-44b9-a7ae-fdc4da347c45', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc1922da-04bc-4a19-818b-e4483fd46b40, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=0c0854bd-fd12-4869-8d7f-57d59abbb6ee) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.769 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 0c0854bd-fd12-4869-8d7f-57d59abbb6ee in datapath 8c525079-021c-44d0-899c-c53f6754298b bound to our chassis#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.771 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 8c525079-021c-44d0-899c-c53f6754298b#033[00m
Dec 11 01:10:55 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:55Z|00167|binding|INFO|Setting lport 0c0854bd-fd12-4869-8d7f-57d59abbb6ee ovn-installed in OVS
Dec 11 01:10:55 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:55Z|00168|binding|INFO|Setting lport 0c0854bd-fd12-4869-8d7f-57d59abbb6ee up in Southbound
Dec 11 01:10:55 np0005554845 NetworkManager[55529]: <info>  [1765433455.7807] manager: (tap7ef553a1-7d): new Tun device (/org/freedesktop/NetworkManager/Devices/84)
Dec 11 01:10:55 np0005554845 kernel: tap7ef553a1-7d: entered promiscuous mode
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.781 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:55 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:55Z|00169|binding|INFO|Claiming lport 7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b for this chassis.
Dec 11 01:10:55 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:55Z|00170|binding|INFO|7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b: Claiming fa:16:3e:cd:e3:37 2001:db8:0:1:f816:3eff:fecd:e337 2001:db8::f816:3eff:fecd:e337
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.788 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5e9b6168-a3b7-4d58-a969-c98f0ab7d9f2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.789 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap8c525079-01 in ovnmeta-8c525079-021c-44d0-899c-c53f6754298b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.791 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:e3:37 2001:db8:0:1:f816:3eff:fecd:e337 2001:db8::f816:3eff:fecd:e337'], port_security=['fa:16:3e:cd:e3:37 2001:db8:0:1:f816:3eff:fecd:e337 2001:db8::f816:3eff:fecd:e337'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fecd:e337/64 2001:db8::f816:3eff:fecd:e337/64', 'neutron:device_id': '12be6919-2546-4cdc-9e86-d73c99aaad0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e539a2e-efc5-4d88-a649-84787d0021ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1eaefc9c-20e5-44b9-a7ae-fdc4da347c45', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70995ac2-b26f-4427-9e41-8f354c5ed362, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.798 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.800 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap8c525079-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.801 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4c51371f-a475-41e0-b498-3f7128211ff6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.803 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[046c77ea-987b-4557-90df-c5a105dcf698]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:55Z|00171|binding|INFO|Setting lport 7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b ovn-installed in OVS
Dec 11 01:10:55 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:55Z|00172|binding|INFO|Setting lport 7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b up in Southbound
Dec 11 01:10:55 np0005554845 systemd-udevd[218537]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:10:55 np0005554845 systemd-udevd[218538]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.809 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.818 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[11561715-9b60-4fd8-9943-39a99955184b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 NetworkManager[55529]: <info>  [1765433455.8223] device (tap7ef553a1-7d): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:10:55 np0005554845 NetworkManager[55529]: <info>  [1765433455.8235] device (tap7ef553a1-7d): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:10:55 np0005554845 NetworkManager[55529]: <info>  [1765433455.8269] device (tap0c0854bd-fd): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:10:55 np0005554845 NetworkManager[55529]: <info>  [1765433455.8280] device (tap0c0854bd-fd): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:10:55 np0005554845 systemd-machined[153381]: New machine qemu-12-instance-00000018.
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.839 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4ece9a6c-e778-4e50-8aa4-96d268a40a7f]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 systemd[1]: Started Virtual Machine qemu-12-instance-00000018.
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.870 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[f51aefe5-7d64-49dd-b160-ea5603e91430]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.876 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a14a6e89-cc17-4a9f-b1dc-e25cffb80725]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 NetworkManager[55529]: <info>  [1765433455.8772] manager: (tap8c525079-00): new Veth device (/org/freedesktop/NetworkManager/Devices/85)
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.906 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[96ca570c-ceb9-41a2-b12c-51ef795b2030]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.909 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[691e805f-0ea8-45b5-9f5e-06256f3dd4f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 NetworkManager[55529]: <info>  [1765433455.9301] device (tap8c525079-00): carrier: link connected
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.934 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[146f2af7-c4ed-4c30-93e4-af157b432de6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.949 187132 DEBUG nova.compute.manager [req-de9670cc-9405-4ed1-aa70-bccdb2ecb0cc req-c6bde529-b07e-4c97-a872-a8bebdd93690 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-plugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.950 187132 DEBUG oslo_concurrency.lockutils [req-de9670cc-9405-4ed1-aa70-bccdb2ecb0cc req-c6bde529-b07e-4c97-a872-a8bebdd93690 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.950 187132 DEBUG oslo_concurrency.lockutils [req-de9670cc-9405-4ed1-aa70-bccdb2ecb0cc req-c6bde529-b07e-4c97-a872-a8bebdd93690 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.950 187132 DEBUG oslo_concurrency.lockutils [req-de9670cc-9405-4ed1-aa70-bccdb2ecb0cc req-c6bde529-b07e-4c97-a872-a8bebdd93690 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:55 np0005554845 nova_compute[187128]: 2025-12-11 06:10:55.950 187132 DEBUG nova.compute.manager [req-de9670cc-9405-4ed1-aa70-bccdb2ecb0cc req-c6bde529-b07e-4c97-a872-a8bebdd93690 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Processing event network-vif-plugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.952 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[29e3430d-abc2-407b-8a9e-dbbec255eeff]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c525079-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:b1:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373057, 'reachable_time': 30521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218573, 'error': None, 'target': 'ovnmeta-8c525079-021c-44d0-899c-c53f6754298b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.964 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b8524de9-2f95-4485-b662-dc1e7c7c5586]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feae:b1f6'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 373057, 'tstamp': 373057}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218574, 'error': None, 'target': 'ovnmeta-8c525079-021c-44d0-899c-c53f6754298b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:55.979 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1cbe5747-08c7-45c2-929c-713f73d028a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap8c525079-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ae:b1:f6'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 49], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373057, 'reachable_time': 30521, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218575, 'error': None, 'target': 'ovnmeta-8c525079-021c-44d0-899c-c53f6754298b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.010 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8353d279-18bd-4b41-b417-e61c99e509c9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.083 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[08112c6e-c79d-47e5-b1e8-f9479984899d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.088 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c525079-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.089 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.089 187132 DEBUG nova.compute.manager [req-81ef66d0-90db-4a07-8945-1fbbb88a0c01 req-09672f54-db0a-4e5e-a292-31dff42bb1a3 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-plugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.089 187132 DEBUG oslo_concurrency.lockutils [req-81ef66d0-90db-4a07-8945-1fbbb88a0c01 req-09672f54-db0a-4e5e-a292-31dff42bb1a3 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.089 187132 DEBUG oslo_concurrency.lockutils [req-81ef66d0-90db-4a07-8945-1fbbb88a0c01 req-09672f54-db0a-4e5e-a292-31dff42bb1a3 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.089 187132 DEBUG oslo_concurrency.lockutils [req-81ef66d0-90db-4a07-8945-1fbbb88a0c01 req-09672f54-db0a-4e5e-a292-31dff42bb1a3 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.090 187132 DEBUG nova.compute.manager [req-81ef66d0-90db-4a07-8945-1fbbb88a0c01 req-09672f54-db0a-4e5e-a292-31dff42bb1a3 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Processing event network-vif-plugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.090 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap8c525079-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:56 np0005554845 NetworkManager[55529]: <info>  [1765433456.1465] manager: (tap8c525079-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/86)
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.145 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:56 np0005554845 kernel: tap8c525079-00: entered promiscuous mode
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.150 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.152 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap8c525079-00, col_values=(('external_ids', {'iface-id': '4ebe56a3-6669-4fce-bcbc-ee948f3aebd8'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.153 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:56 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:56Z|00173|binding|INFO|Releasing lport 4ebe56a3-6669-4fce-bcbc-ee948f3aebd8 from this chassis (sb_readonly=0)
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.179 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.180 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/8c525079-021c-44d0-899c-c53f6754298b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/8c525079-021c-44d0-899c-c53f6754298b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.181 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[618a3d72-ab2b-4584-a32d-fde578360eca]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.182 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-8c525079-021c-44d0-899c-c53f6754298b
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/8c525079-021c-44d0-899c-c53f6754298b.pid.haproxy
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 8c525079-021c-44d0-899c-c53f6754298b
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.184 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-8c525079-021c-44d0-899c-c53f6754298b', 'env', 'PROCESS_TAG=haproxy-8c525079-021c-44d0-899c-c53f6754298b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/8c525079-021c-44d0-899c-c53f6754298b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.557 187132 DEBUG nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Instance event wait completed in 0 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.558 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433456.5565104, 12be6919-2546-4cdc-9e86-d73c99aaad0c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.558 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] VM Started (Lifecycle Event)#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.561 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.565 187132 INFO nova.virt.libvirt.driver [-] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Instance spawned successfully.#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.565 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:10:56 np0005554845 podman[218617]: 2025-12-11 06:10:56.577565783 +0000 UTC m=+0.060833078 container create 8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.592 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.598 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.602 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.603 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.603 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.604 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.605 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.605 187132 DEBUG nova.virt.libvirt.driver [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:10:56 np0005554845 systemd[1]: Started libpod-conmon-8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5.scope.
Dec 11 01:10:56 np0005554845 podman[218617]: 2025-12-11 06:10:56.547570146 +0000 UTC m=+0.030837461 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.642 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.643 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433456.5576627, 12be6919-2546-4cdc-9e86-d73c99aaad0c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.643 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:10:56 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:10:56 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/bc5f23a0de045ddff041f5be5fc61b942d167b15c30b7d7296047daee7fa5734/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:10:56 np0005554845 podman[218617]: 2025-12-11 06:10:56.668430698 +0000 UTC m=+0.151698003 container init 8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 11 01:10:56 np0005554845 podman[218617]: 2025-12-11 06:10:56.675575443 +0000 UTC m=+0.158842758 container start 8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.677 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.682 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433456.5674148, 12be6919-2546-4cdc-9e86-d73c99aaad0c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.682 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.695 187132 INFO nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Took 13.23 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.696 187132 DEBUG nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.703 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.715 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:10:56 np0005554845 neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b[218632]: [NOTICE]   (218636) : New worker (218638) forked
Dec 11 01:10:56 np0005554845 neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b[218632]: [NOTICE]   (218636) : Loading success.
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.730 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b in datapath 1e539a2e-efc5-4d88-a649-84787d0021ea unbound from our chassis#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.733 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1e539a2e-efc5-4d88-a649-84787d0021ea#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.745 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7e56cf87-ad70-4029-bf83-0747577c2334]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.746 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1e539a2e-e1 in ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.748 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.748 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1e539a2e-e0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.748 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ac6370d7-7463-40c0-84b3-2536c3c02989]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.749 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[300c9b9e-7c55-459a-bfc6-3d3d56d82f2c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.760 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[9ef2b07e-dee6-4bb7-9fc6-2c3c95e15b94]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.769 187132 INFO nova.compute.manager [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Took 13.73 seconds to build instance.#033[00m
Dec 11 01:10:56 np0005554845 nova_compute[187128]: 2025-12-11 06:10:56.784 187132 DEBUG oslo_concurrency.lockutils [None req-ec0ab0b1-129b-4e4a-8690-494deb3ebd7f 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 13.877s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.786 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[aa6dcc22-bef3-450b-9b2e-cf3a2417b795]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.813 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[133de4ca-772d-4556-b1f7-70988c4062f4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 NetworkManager[55529]: <info>  [1765433456.8200] manager: (tap1e539a2e-e0): new Veth device (/org/freedesktop/NetworkManager/Devices/87)
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.818 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ddf30d07-cfd2-499d-b404-6a1eb85be287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 systemd-udevd[218564]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.851 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[61188bce-ac8a-4348-9b0a-02c609e4a4b8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.858 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[3d9926ec-b1ab-4af1-9646-3eb3d46a9ab3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 NetworkManager[55529]: <info>  [1765433456.8893] device (tap1e539a2e-e0): carrier: link connected
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.893 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[54665b87-5a88-4074-b009-51fd1d17bad4]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.917 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[046adeda-2c4c-41cc-b255-74b93bb46ecd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1e539a2e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:b0:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373153, 'reachable_time': 25862, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218657, 'error': None, 'target': 'ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.935 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[07a692f0-6e69-462f-9256-ccc4cc6e905e]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea8:b031'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 373153, 'tstamp': 373153}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 218658, 'error': None, 'target': 'ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.957 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3255715c-7933-4356-9464-bcf5700e873a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1e539a2e-e1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a8:b0:31'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 50], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373153, 'reachable_time': 25862, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 218659, 'error': None, 'target': 'ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:56.999 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fcdd91c8-b900-48c5-bac0-b21b45a09bb2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:57.042 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5cab3888-8328-49b6-80e3-6be5d0b78a46]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:57.044 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e539a2e-e0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:57.044 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:57.044 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1e539a2e-e0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:57 np0005554845 NetworkManager[55529]: <info>  [1765433457.0465] manager: (tap1e539a2e-e0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/88)
Dec 11 01:10:57 np0005554845 nova_compute[187128]: 2025-12-11 06:10:57.045 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:57 np0005554845 kernel: tap1e539a2e-e0: entered promiscuous mode
Dec 11 01:10:57 np0005554845 nova_compute[187128]: 2025-12-11 06:10:57.049 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:57.051 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1e539a2e-e0, col_values=(('external_ids', {'iface-id': '3872e131-c169-4394-ac83-4609db001ee7'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:10:57 np0005554845 nova_compute[187128]: 2025-12-11 06:10:57.052 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:57 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:57Z|00174|binding|INFO|Releasing lport 3872e131-c169-4394-ac83-4609db001ee7 from this chassis (sb_readonly=0)
Dec 11 01:10:57 np0005554845 nova_compute[187128]: 2025-12-11 06:10:57.069 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:57.072 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1e539a2e-efc5-4d88-a649-84787d0021ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1e539a2e-efc5-4d88-a649-84787d0021ea.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:57.074 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[26d89717-d5fb-46eb-9bec-ee92356f0257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:57.074 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-1e539a2e-efc5-4d88-a649-84787d0021ea
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/1e539a2e-efc5-4d88-a649-84787d0021ea.pid.haproxy
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 1e539a2e-efc5-4d88-a649-84787d0021ea
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:10:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:10:57.075 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea', 'env', 'PROCESS_TAG=haproxy-1e539a2e-efc5-4d88-a649-84787d0021ea', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1e539a2e-efc5-4d88-a649-84787d0021ea.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:10:57 np0005554845 nova_compute[187128]: 2025-12-11 06:10:57.364 187132 DEBUG nova.network.neutron [req-387edf91-61ed-4723-917e-22b505c25a66 req-1e227762-821a-422d-90a1-11b01096bac7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Updated VIF entry in instance network info cache for port 7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:10:57 np0005554845 nova_compute[187128]: 2025-12-11 06:10:57.364 187132 DEBUG nova.network.neutron [req-387edf91-61ed-4723-917e-22b505c25a66 req-1e227762-821a-422d-90a1-11b01096bac7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Updating instance_info_cache with network_info: [{"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:10:57 np0005554845 nova_compute[187128]: 2025-12-11 06:10:57.384 187132 DEBUG oslo_concurrency.lockutils [req-387edf91-61ed-4723-917e-22b505c25a66 req-1e227762-821a-422d-90a1-11b01096bac7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:10:57 np0005554845 podman[218690]: 2025-12-11 06:10:57.496744709 +0000 UTC m=+0.061497516 container create cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3)
Dec 11 01:10:57 np0005554845 podman[218690]: 2025-12-11 06:10:57.459251247 +0000 UTC m=+0.024004124 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:10:57 np0005554845 systemd[1]: Started libpod-conmon-cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12.scope.
Dec 11 01:10:57 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:10:57 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/6f2c9d63339d0f54b19a4e8931fab0f5d93061b23432208acd9ce9ede80ad9fd/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:10:57 np0005554845 podman[218690]: 2025-12-11 06:10:57.614550587 +0000 UTC m=+0.179303364 container init cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 11 01:10:57 np0005554845 podman[218704]: 2025-12-11 06:10:57.615814002 +0000 UTC m=+0.073660917 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:10:57 np0005554845 podman[218705]: 2025-12-11 06:10:57.622977907 +0000 UTC m=+0.076022882 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, io.openshift.tags=minimal rhel9, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-type=git, vendor=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', 
'/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, maintainer=Red Hat, Inc., io.openshift.expose-services=, container_name=openstack_network_exporter, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, distribution-scope=public, io.buildah.version=1.33.7)
Dec 11 01:10:57 np0005554845 podman[218690]: 2025-12-11 06:10:57.626827482 +0000 UTC m=+0.191580249 container start cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Dec 11 01:10:57 np0005554845 neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea[218726]: [NOTICE]   (218754) : New worker (218756) forked
Dec 11 01:10:57 np0005554845 neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea[218726]: [NOTICE]   (218754) : Loading success.
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.035 187132 DEBUG nova.compute.manager [req-910598ba-d4de-41ba-ab8b-eae290979805 req-8176c13e-7517-449e-94dc-5c6530d65842 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-plugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.036 187132 DEBUG oslo_concurrency.lockutils [req-910598ba-d4de-41ba-ab8b-eae290979805 req-8176c13e-7517-449e-94dc-5c6530d65842 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.036 187132 DEBUG oslo_concurrency.lockutils [req-910598ba-d4de-41ba-ab8b-eae290979805 req-8176c13e-7517-449e-94dc-5c6530d65842 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.036 187132 DEBUG oslo_concurrency.lockutils [req-910598ba-d4de-41ba-ab8b-eae290979805 req-8176c13e-7517-449e-94dc-5c6530d65842 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.037 187132 DEBUG nova.compute.manager [req-910598ba-d4de-41ba-ab8b-eae290979805 req-8176c13e-7517-449e-94dc-5c6530d65842 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] No waiting events found dispatching network-vif-plugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.037 187132 WARNING nova.compute.manager [req-910598ba-d4de-41ba-ab8b-eae290979805 req-8176c13e-7517-449e-94dc-5c6530d65842 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received unexpected event network-vif-plugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee for instance with vm_state active and task_state None.#033[00m
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.366 187132 DEBUG nova.compute.manager [req-5dc9a124-bcbe-4862-85c5-3e7276bd5734 req-17f9eeab-caee-4914-90a5-ae20f18a05db eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-plugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.367 187132 DEBUG oslo_concurrency.lockutils [req-5dc9a124-bcbe-4862-85c5-3e7276bd5734 req-17f9eeab-caee-4914-90a5-ae20f18a05db eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.367 187132 DEBUG oslo_concurrency.lockutils [req-5dc9a124-bcbe-4862-85c5-3e7276bd5734 req-17f9eeab-caee-4914-90a5-ae20f18a05db eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.367 187132 DEBUG oslo_concurrency.lockutils [req-5dc9a124-bcbe-4862-85c5-3e7276bd5734 req-17f9eeab-caee-4914-90a5-ae20f18a05db eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.368 187132 DEBUG nova.compute.manager [req-5dc9a124-bcbe-4862-85c5-3e7276bd5734 req-17f9eeab-caee-4914-90a5-ae20f18a05db eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] No waiting events found dispatching network-vif-plugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:10:58 np0005554845 nova_compute[187128]: 2025-12-11 06:10:58.368 187132 WARNING nova.compute.manager [req-5dc9a124-bcbe-4862-85c5-3e7276bd5734 req-17f9eeab-caee-4914-90a5-ae20f18a05db eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received unexpected event network-vif-plugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b for instance with vm_state active and task_state None.#033[00m
Dec 11 01:10:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:10:58Z|00021|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:ed:3d:67 10.100.0.7
Dec 11 01:10:59 np0005554845 nova_compute[187128]: 2025-12-11 06:10:59.996 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:00 np0005554845 nova_compute[187128]: 2025-12-11 06:11:00.186 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:00 np0005554845 nova_compute[187128]: 2025-12-11 06:11:00.979 187132 DEBUG nova.compute.manager [req-41322a11-705a-429c-b335-f9d31207c8c0 req-256bf375-d3e5-41bc-afbf-55ad2dd4fa6a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-changed-0c0854bd-fd12-4869-8d7f-57d59abbb6ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:00 np0005554845 nova_compute[187128]: 2025-12-11 06:11:00.980 187132 DEBUG nova.compute.manager [req-41322a11-705a-429c-b335-f9d31207c8c0 req-256bf375-d3e5-41bc-afbf-55ad2dd4fa6a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Refreshing instance network info cache due to event network-changed-0c0854bd-fd12-4869-8d7f-57d59abbb6ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:11:00 np0005554845 nova_compute[187128]: 2025-12-11 06:11:00.980 187132 DEBUG oslo_concurrency.lockutils [req-41322a11-705a-429c-b335-f9d31207c8c0 req-256bf375-d3e5-41bc-afbf-55ad2dd4fa6a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:11:00 np0005554845 nova_compute[187128]: 2025-12-11 06:11:00.980 187132 DEBUG oslo_concurrency.lockutils [req-41322a11-705a-429c-b335-f9d31207c8c0 req-256bf375-d3e5-41bc-afbf-55ad2dd4fa6a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:11:00 np0005554845 nova_compute[187128]: 2025-12-11 06:11:00.981 187132 DEBUG nova.network.neutron [req-41322a11-705a-429c-b335-f9d31207c8c0 req-256bf375-d3e5-41bc-afbf-55ad2dd4fa6a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Refreshing network info cache for port 0c0854bd-fd12-4869-8d7f-57d59abbb6ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:11:02 np0005554845 nova_compute[187128]: 2025-12-11 06:11:02.515 187132 DEBUG nova.network.neutron [req-41322a11-705a-429c-b335-f9d31207c8c0 req-256bf375-d3e5-41bc-afbf-55ad2dd4fa6a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Updated VIF entry in instance network info cache for port 0c0854bd-fd12-4869-8d7f-57d59abbb6ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:11:02 np0005554845 nova_compute[187128]: 2025-12-11 06:11:02.515 187132 DEBUG nova.network.neutron [req-41322a11-705a-429c-b335-f9d31207c8c0 req-256bf375-d3e5-41bc-afbf-55ad2dd4fa6a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Updating instance_info_cache with network_info: [{"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:11:02 np0005554845 nova_compute[187128]: 2025-12-11 06:11:02.540 187132 DEBUG oslo_concurrency.lockutils [req-41322a11-705a-429c-b335-f9d31207c8c0 req-256bf375-d3e5-41bc-afbf-55ad2dd4fa6a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:11:04 np0005554845 nova_compute[187128]: 2025-12-11 06:11:04.999 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:05 np0005554845 nova_compute[187128]: 2025-12-11 06:11:05.195 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:05 np0005554845 nova_compute[187128]: 2025-12-11 06:11:05.238 187132 INFO nova.compute.manager [None req-36ebf374-a72f-4978-9915-55d5f346508b 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Get console output#033[00m
Dec 11 01:11:05 np0005554845 nova_compute[187128]: 2025-12-11 06:11:05.244 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:11:08 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:08Z|00022|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c3:da:05 10.100.0.3
Dec 11 01:11:08 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:08Z|00023|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c3:da:05 10.100.0.3
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.845 187132 DEBUG oslo_concurrency.lockutils [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.846 187132 DEBUG oslo_concurrency.lockutils [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.846 187132 DEBUG oslo_concurrency.lockutils [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.846 187132 DEBUG oslo_concurrency.lockutils [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.847 187132 DEBUG oslo_concurrency.lockutils [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.848 187132 INFO nova.compute.manager [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Terminating instance#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.848 187132 DEBUG nova.compute.manager [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:11:08 np0005554845 kernel: tap4548537f-64 (unregistering): left promiscuous mode
Dec 11 01:11:08 np0005554845 NetworkManager[55529]: <info>  [1765433468.8752] device (tap4548537f-64): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:11:08 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:08Z|00175|binding|INFO|Releasing lport 4548537f-6484-4703-a9a0-4975e2aa784b from this chassis (sb_readonly=0)
Dec 11 01:11:08 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:08Z|00176|binding|INFO|Setting lport 4548537f-6484-4703-a9a0-4975e2aa784b down in Southbound
Dec 11 01:11:08 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:08Z|00177|binding|INFO|Removing iface tap4548537f-64 ovn-installed in OVS
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.933 187132 DEBUG nova.compute.manager [req-8e1dbf33-0b97-4713-a864-da5cc1419f89 req-9e8a2d75-8898-4403-8494-f77ee321ccad eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-changed-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.934 187132 DEBUG nova.compute.manager [req-8e1dbf33-0b97-4713-a864-da5cc1419f89 req-9e8a2d75-8898-4403-8494-f77ee321ccad eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Refreshing instance network info cache due to event network-changed-4548537f-6484-4703-a9a0-4975e2aa784b. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.934 187132 DEBUG oslo_concurrency.lockutils [req-8e1dbf33-0b97-4713-a864-da5cc1419f89 req-9e8a2d75-8898-4403-8494-f77ee321ccad eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.934 187132 DEBUG oslo_concurrency.lockutils [req-8e1dbf33-0b97-4713-a864-da5cc1419f89 req-9e8a2d75-8898-4403-8494-f77ee321ccad eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.934 187132 DEBUG nova.network.neutron [req-8e1dbf33-0b97-4713-a864-da5cc1419f89 req-9e8a2d75-8898-4403-8494-f77ee321ccad eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Refreshing network info cache for port 4548537f-6484-4703-a9a0-4975e2aa784b _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.936 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.938 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:08 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:08.941 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:ed:3d:67 10.100.0.7'], port_security=['fa:16:3e:ed:3d:67 10.100.0.7'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.7/28', 'neutron:device_id': '182af6cf-b56e-4c6a-aeb5-092944f1745a', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5914354e-3ed3-47fd-a912-9c7227988a8d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'b2a27f29-0456-4cbd-bd3d-dddfa9586d24', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=a2caf243-c0ee-48b7-bf4e-5d1ee61c0e28, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=4548537f-6484-4703-a9a0-4975e2aa784b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:11:08 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:08.943 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 4548537f-6484-4703-a9a0-4975e2aa784b in datapath 5914354e-3ed3-47fd-a912-9c7227988a8d unbound from our chassis#033[00m
Dec 11 01:11:08 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:08.945 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5914354e-3ed3-47fd-a912-9c7227988a8d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:11:08 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:08.949 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8357f265-d8b2-4771-9328-11699ffa7f7c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:08 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:08.949 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d namespace which is not needed anymore#033[00m
Dec 11 01:11:08 np0005554845 nova_compute[187128]: 2025-12-11 06:11:08.963 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:08 np0005554845 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Deactivated successfully.
Dec 11 01:11:08 np0005554845 systemd[1]: machine-qemu\x2d11\x2dinstance\x2d00000016.scope: Consumed 13.285s CPU time.
Dec 11 01:11:08 np0005554845 systemd-machined[153381]: Machine qemu-11-instance-00000016 terminated.
Dec 11 01:11:09 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[218410]: [NOTICE]   (218414) : haproxy version is 2.8.14-c23fe91
Dec 11 01:11:09 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[218410]: [NOTICE]   (218414) : path to executable is /usr/sbin/haproxy
Dec 11 01:11:09 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[218410]: [WARNING]  (218414) : Exiting Master process...
Dec 11 01:11:09 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[218410]: [ALERT]    (218414) : Current worker (218416) exited with code 143 (Terminated)
Dec 11 01:11:09 np0005554845 neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d[218410]: [WARNING]  (218414) : All workers exited. Exiting... (0)
Dec 11 01:11:09 np0005554845 systemd[1]: libpod-741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be.scope: Deactivated successfully.
Dec 11 01:11:09 np0005554845 podman[218802]: 2025-12-11 06:11:09.10596421 +0000 UTC m=+0.056001457 container died 741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2)
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.110 187132 INFO nova.virt.libvirt.driver [-] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Instance destroyed successfully.#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.111 187132 DEBUG nova.objects.instance [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'resources' on Instance uuid 182af6cf-b56e-4c6a-aeb5-092944f1745a obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.125 187132 DEBUG nova.virt.libvirt.vif [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:09:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1684014384',display_name='tempest-TestNetworkAdvancedServerOps-server-1684014384',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1684014384',id=22,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBJZn2r+4XJ+BXVtMgu99zp7c2YbMyuHNcWOcnaOXRTzY0GtIyqBDnE+K2336Ko+1tdZUpzJeFdbHNec8NIxnOzc6MVsnUK9kDOH2YZAfybhw/CgYHrjVTBGZLsW2tYlTcQ==',key_name='tempest-TestNetworkAdvancedServerOps-241632790',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:10:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-63kpzom1',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:10:44Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=182af6cf-b56e-4c6a-aeb5-092944f1745a,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.126 187132 DEBUG nova.network.os_vif_util [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.201", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.127 187132 DEBUG nova.network.os_vif_util [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:ed:3d:67,bridge_name='br-int',has_traffic_filtering=True,id=4548537f-6484-4703-a9a0-4975e2aa784b,network=Network(5914354e-3ed3-47fd-a912-9c7227988a8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4548537f-64') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.127 187132 DEBUG os_vif [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:3d:67,bridge_name='br-int',has_traffic_filtering=True,id=4548537f-6484-4703-a9a0-4975e2aa784b,network=Network(5914354e-3ed3-47fd-a912-9c7227988a8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4548537f-64') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.128 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.129 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4548537f-64, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.130 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.132 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.134 187132 INFO os_vif [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:ed:3d:67,bridge_name='br-int',has_traffic_filtering=True,id=4548537f-6484-4703-a9a0-4975e2aa784b,network=Network(5914354e-3ed3-47fd-a912-9c7227988a8d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4548537f-64')#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.134 187132 INFO nova.virt.libvirt.driver [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Deleting instance files /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a_del#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.135 187132 INFO nova.virt.libvirt.driver [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Deletion of /var/lib/nova/instances/182af6cf-b56e-4c6a-aeb5-092944f1745a_del complete#033[00m
Dec 11 01:11:09 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be-userdata-shm.mount: Deactivated successfully.
Dec 11 01:11:09 np0005554845 systemd[1]: var-lib-containers-storage-overlay-bf707fef1c79443ffb661b819a4aea9cb0fd5754d6dd4abc4e1d22d38ca81ab7-merged.mount: Deactivated successfully.
Dec 11 01:11:09 np0005554845 podman[218802]: 2025-12-11 06:11:09.153170115 +0000 UTC m=+0.103207312 container cleanup 741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:11:09 np0005554845 systemd[1]: libpod-conmon-741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be.scope: Deactivated successfully.
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.189 187132 INFO nova.compute.manager [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Took 0.34 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.191 187132 DEBUG oslo.service.loopingcall [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.191 187132 DEBUG nova.compute.manager [-] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.191 187132 DEBUG nova.network.neutron [-] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:11:09 np0005554845 podman[218849]: 2025-12-11 06:11:09.227805458 +0000 UTC m=+0.050780015 container remove 741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 11 01:11:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:09.233 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7cc36613-d391-4056-aac4-459d8b6f323d]: (4, ('Thu Dec 11 06:11:09 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d (741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be)\n741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be\nThu Dec 11 06:11:09 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d (741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be)\n741ac55f8aa48ebcd784f39aa0fec52be0e4ecd021715d486420e81aa1be62be\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:09.235 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ef02f6a3-e03d-40ce-ae57-0ac994fb31db]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:09.237 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5914354e-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.239 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:09 np0005554845 kernel: tap5914354e-30: left promiscuous mode
Dec 11 01:11:09 np0005554845 nova_compute[187128]: 2025-12-11 06:11:09.251 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:09.255 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c2dab2bf-522e-492a-996c-2cfa5a4f8aae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:09.269 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fd3ca820-f013-4ce5-b155-8e2f9988f775]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:09.271 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[79cd15a7-3f9b-422f-a86a-ad8c4b274668]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:09.291 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[49a57791-1eca-4002-8b4a-f63ff34aa8c9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 371895, 'reachable_time': 24319, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 218864, 'error': None, 'target': 'ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:09.295 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5914354e-3ed3-47fd-a912-9c7227988a8d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:11:09 np0005554845 systemd[1]: run-netns-ovnmeta\x2d5914354e\x2d3ed3\x2d47fd\x2da912\x2d9c7227988a8d.mount: Deactivated successfully.
Dec 11 01:11:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:09.295 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[d2e632e9-0363-4d2f-b36f-2e5fc3311fa5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.197 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.573 187132 DEBUG nova.compute.manager [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-vif-unplugged-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.574 187132 DEBUG oslo_concurrency.lockutils [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.575 187132 DEBUG oslo_concurrency.lockutils [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.576 187132 DEBUG oslo_concurrency.lockutils [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.576 187132 DEBUG nova.compute.manager [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] No waiting events found dispatching network-vif-unplugged-4548537f-6484-4703-a9a0-4975e2aa784b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.577 187132 DEBUG nova.compute.manager [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-vif-unplugged-4548537f-6484-4703-a9a0-4975e2aa784b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.578 187132 DEBUG nova.compute.manager [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.579 187132 DEBUG oslo_concurrency.lockutils [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.579 187132 DEBUG oslo_concurrency.lockutils [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.580 187132 DEBUG oslo_concurrency.lockutils [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.581 187132 DEBUG nova.compute.manager [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] No waiting events found dispatching network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:11:10 np0005554845 nova_compute[187128]: 2025-12-11 06:11:10.582 187132 WARNING nova.compute.manager [req-e638783e-c923-4bfc-b71f-14b454f4584f req-24b68d7b-0fd3-4ea2-8770-f646baf347fb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received unexpected event network-vif-plugged-4548537f-6484-4703-a9a0-4975e2aa784b for instance with vm_state active and task_state deleting.#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.373 187132 DEBUG nova.network.neutron [-] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.400 187132 INFO nova.compute.manager [-] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Took 2.21 seconds to deallocate network for instance.#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.459 187132 DEBUG oslo_concurrency.lockutils [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.459 187132 DEBUG oslo_concurrency.lockutils [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.462 187132 DEBUG nova.compute.manager [req-10336a36-89c3-4d42-969b-9c8d47284157 req-aba9529f-292e-4ff5-8c22-02fb8b56397d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Received event network-vif-deleted-4548537f-6484-4703-a9a0-4975e2aa784b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.593 187132 DEBUG nova.compute.provider_tree [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.609 187132 DEBUG nova.scheduler.client.report [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.632 187132 DEBUG oslo_concurrency.lockutils [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.662 187132 INFO nova.scheduler.client.report [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Deleted allocations for instance 182af6cf-b56e-4c6a-aeb5-092944f1745a#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.722 187132 DEBUG oslo_concurrency.lockutils [None req-6f4c8347-1d78-4708-a7de-6ff8ff959eb3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "182af6cf-b56e-4c6a-aeb5-092944f1745a" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.876s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.775 187132 DEBUG nova.network.neutron [req-8e1dbf33-0b97-4713-a864-da5cc1419f89 req-9e8a2d75-8898-4403-8494-f77ee321ccad eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Updated VIF entry in instance network info cache for port 4548537f-6484-4703-a9a0-4975e2aa784b. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.776 187132 DEBUG nova.network.neutron [req-8e1dbf33-0b97-4713-a864-da5cc1419f89 req-9e8a2d75-8898-4403-8494-f77ee321ccad eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Updating instance_info_cache with network_info: [{"id": "4548537f-6484-4703-a9a0-4975e2aa784b", "address": "fa:16:3e:ed:3d:67", "network": {"id": "5914354e-3ed3-47fd-a912-9c7227988a8d", "bridge": "br-int", "label": "tempest-network-smoke--34129759", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.7", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4548537f-64", "ovs_interfaceid": "4548537f-6484-4703-a9a0-4975e2aa784b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:11:11 np0005554845 nova_compute[187128]: 2025-12-11 06:11:11.815 187132 DEBUG oslo_concurrency.lockutils [req-8e1dbf33-0b97-4713-a864-da5cc1419f89 req-9e8a2d75-8898-4403-8494-f77ee321ccad eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-182af6cf-b56e-4c6a-aeb5-092944f1745a" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:11:13 np0005554845 podman[218865]: 2025-12-11 06:11:13.176267185 +0000 UTC m=+0.090189368 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:11:14 np0005554845 nova_compute[187128]: 2025-12-11 06:11:14.131 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:15 np0005554845 nova_compute[187128]: 2025-12-11 06:11:15.198 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:15 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:15Z|00178|binding|INFO|Releasing lport 3872e131-c169-4394-ac83-4609db001ee7 from this chassis (sb_readonly=0)
Dec 11 01:11:15 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:15Z|00179|binding|INFO|Releasing lport 4ebe56a3-6669-4fce-bcbc-ee948f3aebd8 from this chassis (sb_readonly=0)
Dec 11 01:11:15 np0005554845 nova_compute[187128]: 2025-12-11 06:11:15.893 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.132 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 podman[218891]: 2025-12-11 06:11:19.175800913 +0000 UTC m=+0.102590956 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.314 187132 DEBUG nova.compute.manager [req-04a65c19-335b-40c8-8ec0-2a8a24c0f62a req-ba7b80b1-99be-447b-9317-8aa68ed8bfeb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-changed-0c0854bd-fd12-4869-8d7f-57d59abbb6ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.315 187132 DEBUG nova.compute.manager [req-04a65c19-335b-40c8-8ec0-2a8a24c0f62a req-ba7b80b1-99be-447b-9317-8aa68ed8bfeb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Refreshing instance network info cache due to event network-changed-0c0854bd-fd12-4869-8d7f-57d59abbb6ee. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.316 187132 DEBUG oslo_concurrency.lockutils [req-04a65c19-335b-40c8-8ec0-2a8a24c0f62a req-ba7b80b1-99be-447b-9317-8aa68ed8bfeb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.316 187132 DEBUG oslo_concurrency.lockutils [req-04a65c19-335b-40c8-8ec0-2a8a24c0f62a req-ba7b80b1-99be-447b-9317-8aa68ed8bfeb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.317 187132 DEBUG nova.network.neutron [req-04a65c19-335b-40c8-8ec0-2a8a24c0f62a req-ba7b80b1-99be-447b-9317-8aa68ed8bfeb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Refreshing network info cache for port 0c0854bd-fd12-4869-8d7f-57d59abbb6ee _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.383 187132 DEBUG oslo_concurrency.lockutils [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.384 187132 DEBUG oslo_concurrency.lockutils [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.384 187132 DEBUG oslo_concurrency.lockutils [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.384 187132 DEBUG oslo_concurrency.lockutils [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.385 187132 DEBUG oslo_concurrency.lockutils [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.386 187132 INFO nova.compute.manager [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Terminating instance#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.386 187132 DEBUG nova.compute.manager [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:11:19 np0005554845 kernel: tap0c0854bd-fd (unregistering): left promiscuous mode
Dec 11 01:11:19 np0005554845 NetworkManager[55529]: <info>  [1765433479.4090] device (tap0c0854bd-fd): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:11:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:19Z|00180|binding|INFO|Releasing lport 0c0854bd-fd12-4869-8d7f-57d59abbb6ee from this chassis (sb_readonly=0)
Dec 11 01:11:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:19Z|00181|binding|INFO|Setting lport 0c0854bd-fd12-4869-8d7f-57d59abbb6ee down in Southbound
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.416 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:19Z|00182|binding|INFO|Removing iface tap0c0854bd-fd ovn-installed in OVS
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.419 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.428 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c3:da:05 10.100.0.3'], port_security=['fa:16:3e:c3:da:05 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '12be6919-2546-4cdc-9e86-d73c99aaad0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-8c525079-021c-44d0-899c-c53f6754298b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1eaefc9c-20e5-44b9-a7ae-fdc4da347c45', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dc1922da-04bc-4a19-818b-e4483fd46b40, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=0c0854bd-fd12-4869-8d7f-57d59abbb6ee) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.429 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 0c0854bd-fd12-4869-8d7f-57d59abbb6ee in datapath 8c525079-021c-44d0-899c-c53f6754298b unbound from our chassis#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.431 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 8c525079-021c-44d0-899c-c53f6754298b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.431 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.433 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f6f9842d-d64f-4acc-a99b-af64b419795c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.433 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-8c525079-021c-44d0-899c-c53f6754298b namespace which is not needed anymore#033[00m
Dec 11 01:11:19 np0005554845 kernel: tap7ef553a1-7d (unregistering): left promiscuous mode
Dec 11 01:11:19 np0005554845 NetworkManager[55529]: <info>  [1765433479.4424] device (tap7ef553a1-7d): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.445 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.457 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:19Z|00183|binding|INFO|Releasing lport 7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b from this chassis (sb_readonly=0)
Dec 11 01:11:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:19Z|00184|binding|INFO|Setting lport 7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b down in Southbound
Dec 11 01:11:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:19Z|00185|binding|INFO|Removing iface tap7ef553a1-7d ovn-installed in OVS
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.458 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.471 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:cd:e3:37 2001:db8:0:1:f816:3eff:fecd:e337 2001:db8::f816:3eff:fecd:e337'], port_security=['fa:16:3e:cd:e3:37 2001:db8:0:1:f816:3eff:fecd:e337 2001:db8::f816:3eff:fecd:e337'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fecd:e337/64 2001:db8::f816:3eff:fecd:e337/64', 'neutron:device_id': '12be6919-2546-4cdc-9e86-d73c99aaad0c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1e539a2e-efc5-4d88-a649-84787d0021ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1eaefc9c-20e5-44b9-a7ae-fdc4da347c45', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=70995ac2-b26f-4427-9e41-8f354c5ed362, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.471 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000018.scope: Deactivated successfully.
Dec 11 01:11:19 np0005554845 systemd[1]: machine-qemu\x2d12\x2dinstance\x2d00000018.scope: Consumed 14.145s CPU time.
Dec 11 01:11:19 np0005554845 systemd-machined[153381]: Machine qemu-12-instance-00000018 terminated.
Dec 11 01:11:19 np0005554845 neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b[218632]: [NOTICE]   (218636) : haproxy version is 2.8.14-c23fe91
Dec 11 01:11:19 np0005554845 neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b[218632]: [NOTICE]   (218636) : path to executable is /usr/sbin/haproxy
Dec 11 01:11:19 np0005554845 neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b[218632]: [WARNING]  (218636) : Exiting Master process...
Dec 11 01:11:19 np0005554845 neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b[218632]: [ALERT]    (218636) : Current worker (218638) exited with code 143 (Terminated)
Dec 11 01:11:19 np0005554845 neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b[218632]: [WARNING]  (218636) : All workers exited. Exiting... (0)
Dec 11 01:11:19 np0005554845 systemd[1]: libpod-8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5.scope: Deactivated successfully.
Dec 11 01:11:19 np0005554845 podman[218943]: 2025-12-11 06:11:19.579643382 +0000 UTC m=+0.050040825 container died 8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:11:19 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5-userdata-shm.mount: Deactivated successfully.
Dec 11 01:11:19 np0005554845 systemd[1]: var-lib-containers-storage-overlay-bc5f23a0de045ddff041f5be5fc61b942d167b15c30b7d7296047daee7fa5734-merged.mount: Deactivated successfully.
Dec 11 01:11:19 np0005554845 podman[218943]: 2025-12-11 06:11:19.617861022 +0000 UTC m=+0.088258465 container cleanup 8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 11 01:11:19 np0005554845 NetworkManager[55529]: <info>  [1765433479.6246] manager: (tap7ef553a1-7d): new Tun device (/org/freedesktop/NetworkManager/Devices/89)
Dec 11 01:11:19 np0005554845 systemd[1]: libpod-conmon-8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5.scope: Deactivated successfully.
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.663 187132 INFO nova.virt.libvirt.driver [-] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Instance destroyed successfully.#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.665 187132 DEBUG nova.objects.instance [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'resources' on Instance uuid 12be6919-2546-4cdc-9e86-d73c99aaad0c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.679 187132 DEBUG nova.virt.libvirt.vif [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1058948979',display_name='tempest-TestGettingAddress-server-1058948979',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1058948979',id=24,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKvxcArbgQcBaVlQhT0YV9SpArEMuTTq9c3yQWkMSmfKLY7Z8WYASOT/Zu1BhCks0QrbKz/edsZHSdUKDdywlzIOsMV6p/y8piZ3LJ97vkO4FbcfalR52ueB56xrs3+/jw==',key_name='tempest-TestGettingAddress-80412355',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:10:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-wh8dxzqk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:10:56Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=12be6919-2546-4cdc-9e86-d73c99aaad0c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.679 187132 DEBUG nova.network.os_vif_util [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.680 187132 DEBUG nova.network.os_vif_util [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:c3:da:05,bridge_name='br-int',has_traffic_filtering=True,id=0c0854bd-fd12-4869-8d7f-57d59abbb6ee,network=Network(8c525079-021c-44d0-899c-c53f6754298b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c0854bd-fd') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.681 187132 DEBUG os_vif [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:da:05,bridge_name='br-int',has_traffic_filtering=True,id=0c0854bd-fd12-4869-8d7f-57d59abbb6ee,network=Network(8c525079-021c-44d0-899c-c53f6754298b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c0854bd-fd') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.682 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.682 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0c0854bd-fd, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.683 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 podman[218990]: 2025-12-11 06:11:19.685341021 +0000 UTC m=+0.041611874 container remove 8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.686 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.688 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.690 187132 INFO os_vif [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:c3:da:05,bridge_name='br-int',has_traffic_filtering=True,id=0c0854bd-fd12-4869-8d7f-57d59abbb6ee,network=Network(8c525079-021c-44d0-899c-c53f6754298b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0c0854bd-fd')#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.691 187132 DEBUG nova.virt.libvirt.vif [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:10:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1058948979',display_name='tempest-TestGettingAddress-server-1058948979',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1058948979',id=24,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBKvxcArbgQcBaVlQhT0YV9SpArEMuTTq9c3yQWkMSmfKLY7Z8WYASOT/Zu1BhCks0QrbKz/edsZHSdUKDdywlzIOsMV6p/y8piZ3LJ97vkO4FbcfalR52ueB56xrs3+/jw==',key_name='tempest-TestGettingAddress-80412355',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:10:56Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-wh8dxzqk',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:10:56Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=12be6919-2546-4cdc-9e86-d73c99aaad0c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.691 187132 DEBUG nova.network.os_vif_util [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.692 187132 DEBUG nova.network.os_vif_util [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:cd:e3:37,bridge_name='br-int',has_traffic_filtering=True,id=7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b,network=Network(1e539a2e-efc5-4d88-a649-84787d0021ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ef553a1-7d') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.692 187132 DEBUG os_vif [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:e3:37,bridge_name='br-int',has_traffic_filtering=True,id=7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b,network=Network(1e539a2e-efc5-4d88-a649-84787d0021ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ef553a1-7d') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.692 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1d816d6c-bc06-4734-8c75-3ebcf644c992]: (4, ('Thu Dec 11 06:11:19 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b (8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5)\n8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5\nThu Dec 11 06:11:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-8c525079-021c-44d0-899c-c53f6754298b (8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5)\n8f358fa77e1c68dcfb01b80e38823e3c2d4fbea35bd313aa5a2a80ab03c10fe5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.693 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.693 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7ef553a1-7d, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.694 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6b8c4146-daf1-45b9-b4e5-92b645491e12]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.694 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.696 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap8c525079-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:19 np0005554845 kernel: tap8c525079-00: left promiscuous mode
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.700 187132 DEBUG nova.compute.manager [req-e1f49998-09e0-4dd8-90e6-fc5d16a8633c req-e0a859df-e485-4f36-8f7f-d6b55f8c218f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-unplugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.700 187132 DEBUG oslo_concurrency.lockutils [req-e1f49998-09e0-4dd8-90e6-fc5d16a8633c req-e0a859df-e485-4f36-8f7f-d6b55f8c218f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.700 187132 DEBUG oslo_concurrency.lockutils [req-e1f49998-09e0-4dd8-90e6-fc5d16a8633c req-e0a859df-e485-4f36-8f7f-d6b55f8c218f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.700 187132 DEBUG oslo_concurrency.lockutils [req-e1f49998-09e0-4dd8-90e6-fc5d16a8633c req-e0a859df-e485-4f36-8f7f-d6b55f8c218f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.701 187132 DEBUG nova.compute.manager [req-e1f49998-09e0-4dd8-90e6-fc5d16a8633c req-e0a859df-e485-4f36-8f7f-d6b55f8c218f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] No waiting events found dispatching network-vif-unplugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.701 187132 DEBUG nova.compute.manager [req-e1f49998-09e0-4dd8-90e6-fc5d16a8633c req-e0a859df-e485-4f36-8f7f-d6b55f8c218f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-unplugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.701 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.702 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7baf5332-10d4-4af7-b6ed-cdacf21dc1f1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.703 187132 INFO os_vif [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:cd:e3:37,bridge_name='br-int',has_traffic_filtering=True,id=7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b,network=Network(1e539a2e-efc5-4d88-a649-84787d0021ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7ef553a1-7d')#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.704 187132 INFO nova.virt.libvirt.driver [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Deleting instance files /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c_del#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.704 187132 INFO nova.virt.libvirt.driver [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Deletion of /var/lib/nova/instances/12be6919-2546-4cdc-9e86-d73c99aaad0c_del complete#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.709 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.717 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[51308613-4ee8-49df-8719-eeb95733bf2a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.719 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[34002c8d-d192-4028-b86e-ead2bfe73790]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.735 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[9fab437a-deb6-470d-b37a-8c1b217b4620]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373051, 'reachable_time': 44273, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219020, 'error': None, 'target': 'ovnmeta-8c525079-021c-44d0-899c-c53f6754298b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.739 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-8c525079-021c-44d0-899c-c53f6754298b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.739 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[52435505-bbee-4d27-aa76-15be3768f08b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:19 np0005554845 systemd[1]: run-netns-ovnmeta\x2d8c525079\x2d021c\x2d44d0\x2d899c\x2dc53f6754298b.mount: Deactivated successfully.
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.741 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b in datapath 1e539a2e-efc5-4d88-a649-84787d0021ea unbound from our chassis#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.743 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1e539a2e-efc5-4d88-a649-84787d0021ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.744 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3176fc4f-5646-4bc4-afba-ce4a1a405cd2]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.745 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea namespace which is not needed anymore#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.767 187132 INFO nova.compute.manager [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Took 0.38 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.768 187132 DEBUG oslo.service.loopingcall [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.768 187132 DEBUG nova.compute.manager [-] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.768 187132 DEBUG nova.network.neutron [-] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:11:19 np0005554845 neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea[218726]: [NOTICE]   (218754) : haproxy version is 2.8.14-c23fe91
Dec 11 01:11:19 np0005554845 neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea[218726]: [NOTICE]   (218754) : path to executable is /usr/sbin/haproxy
Dec 11 01:11:19 np0005554845 neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea[218726]: [WARNING]  (218754) : Exiting Master process...
Dec 11 01:11:19 np0005554845 neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea[218726]: [ALERT]    (218754) : Current worker (218756) exited with code 143 (Terminated)
Dec 11 01:11:19 np0005554845 neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea[218726]: [WARNING]  (218754) : All workers exited. Exiting... (0)
Dec 11 01:11:19 np0005554845 systemd[1]: libpod-cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12.scope: Deactivated successfully.
Dec 11 01:11:19 np0005554845 podman[219038]: 2025-12-11 06:11:19.880752703 +0000 UTC m=+0.051241846 container died cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 11 01:11:19 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12-userdata-shm.mount: Deactivated successfully.
Dec 11 01:11:19 np0005554845 systemd[1]: var-lib-containers-storage-overlay-6f2c9d63339d0f54b19a4e8931fab0f5d93061b23432208acd9ce9ede80ad9fd-merged.mount: Deactivated successfully.
Dec 11 01:11:19 np0005554845 podman[219038]: 2025-12-11 06:11:19.912797296 +0000 UTC m=+0.083286399 container cleanup cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 11 01:11:19 np0005554845 systemd[1]: libpod-conmon-cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12.scope: Deactivated successfully.
Dec 11 01:11:19 np0005554845 podman[219068]: 2025-12-11 06:11:19.981427945 +0000 UTC m=+0.050841715 container remove cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3)
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.988 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2a76966a-7648-40de-b918-45c4ee0cf20f]: (4, ('Thu Dec 11 06:11:19 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea (cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12)\ncdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12\nThu Dec 11 06:11:19 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea (cdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12)\ncdb583fa39f5e3fb8a161dc1275f95dd194aa67ddbff61db9f43cc882f073a12\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.990 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8c6d87bb-a15c-47d5-b168-009167e60c1e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:19.990 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1e539a2e-e0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:19 np0005554845 nova_compute[187128]: 2025-12-11 06:11:19.992 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:19 np0005554845 kernel: tap1e539a2e-e0: left promiscuous mode
Dec 11 01:11:20 np0005554845 nova_compute[187128]: 2025-12-11 06:11:20.003 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:20 np0005554845 nova_compute[187128]: 2025-12-11 06:11:20.004 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:20.005 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[eae4922a-c804-43f6-8f6a-ea7676ef7b43]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:20.025 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[70e4a099-9cd0-4289-89d6-c7dfa788d299]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:20.027 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[29494a91-962c-4f3c-ae2a-29d00a5d3027]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:20.039 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[90595799-7993-44a4-b6f4-beaa4cf8bbef]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 373145, 'reachable_time': 18272, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219083, 'error': None, 'target': 'ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:20.041 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1e539a2e-efc5-4d88-a649-84787d0021ea deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:11:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:20.041 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[9372ef37-ed53-4e30-81f4-7e4d1743e5c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:20 np0005554845 nova_compute[187128]: 2025-12-11 06:11:20.241 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:20 np0005554845 systemd[1]: run-netns-ovnmeta\x2d1e539a2e\x2defc5\x2d4d88\x2da649\x2d84787d0021ea.mount: Deactivated successfully.
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.401 187132 DEBUG nova.network.neutron [req-04a65c19-335b-40c8-8ec0-2a8a24c0f62a req-ba7b80b1-99be-447b-9317-8aa68ed8bfeb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Updated VIF entry in instance network info cache for port 0c0854bd-fd12-4869-8d7f-57d59abbb6ee. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.402 187132 DEBUG nova.network.neutron [req-04a65c19-335b-40c8-8ec0-2a8a24c0f62a req-ba7b80b1-99be-447b-9317-8aa68ed8bfeb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Updating instance_info_cache with network_info: [{"id": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "address": "fa:16:3e:c3:da:05", "network": {"id": "8c525079-021c-44d0-899c-c53f6754298b", "bridge": "br-int", "label": "tempest-network-smoke--1138633284", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0c0854bd-fd", "ovs_interfaceid": "0c0854bd-fd12-4869-8d7f-57d59abbb6ee", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "address": "fa:16:3e:cd:e3:37", "network": {"id": "1e539a2e-efc5-4d88-a649-84787d0021ea", "bridge": "br-int", "label": "tempest-network-smoke--757280817", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fecd:e337", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7ef553a1-7d", "ovs_interfaceid": "7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.426 187132 DEBUG oslo_concurrency.lockutils [req-04a65c19-335b-40c8-8ec0-2a8a24c0f62a req-ba7b80b1-99be-447b-9317-8aa68ed8bfeb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-12be6919-2546-4cdc-9e86-d73c99aaad0c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.434 187132 DEBUG nova.compute.manager [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-unplugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.435 187132 DEBUG oslo_concurrency.lockutils [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.435 187132 DEBUG oslo_concurrency.lockutils [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.436 187132 DEBUG oslo_concurrency.lockutils [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.436 187132 DEBUG nova.compute.manager [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] No waiting events found dispatching network-vif-unplugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.436 187132 DEBUG nova.compute.manager [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-unplugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.436 187132 DEBUG nova.compute.manager [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-plugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.437 187132 DEBUG oslo_concurrency.lockutils [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.437 187132 DEBUG oslo_concurrency.lockutils [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.437 187132 DEBUG oslo_concurrency.lockutils [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.437 187132 DEBUG nova.compute.manager [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] No waiting events found dispatching network-vif-plugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.438 187132 WARNING nova.compute.manager [req-0bf9e6bc-d3bf-4328-8340-a40fecca91e0 req-2e3cbd9b-3a89-48bb-8779-b727592215eb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received unexpected event network-vif-plugged-0c0854bd-fd12-4869-8d7f-57d59abbb6ee for instance with vm_state active and task_state deleting.#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.783 187132 DEBUG nova.compute.manager [req-3093abe6-b03a-4d58-bd17-0265995bb9ad req-d1c28714-2ffd-457e-a365-f61b2225392a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-plugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.784 187132 DEBUG oslo_concurrency.lockutils [req-3093abe6-b03a-4d58-bd17-0265995bb9ad req-d1c28714-2ffd-457e-a365-f61b2225392a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.785 187132 DEBUG oslo_concurrency.lockutils [req-3093abe6-b03a-4d58-bd17-0265995bb9ad req-d1c28714-2ffd-457e-a365-f61b2225392a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.785 187132 DEBUG oslo_concurrency.lockutils [req-3093abe6-b03a-4d58-bd17-0265995bb9ad req-d1c28714-2ffd-457e-a365-f61b2225392a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.786 187132 DEBUG nova.compute.manager [req-3093abe6-b03a-4d58-bd17-0265995bb9ad req-d1c28714-2ffd-457e-a365-f61b2225392a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] No waiting events found dispatching network-vif-plugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.786 187132 WARNING nova.compute.manager [req-3093abe6-b03a-4d58-bd17-0265995bb9ad req-d1c28714-2ffd-457e-a365-f61b2225392a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received unexpected event network-vif-plugged-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b for instance with vm_state active and task_state deleting.#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.930 187132 DEBUG nova.network.neutron [-] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.953 187132 INFO nova.compute.manager [-] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Took 2.18 seconds to deallocate network for instance.#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.988 187132 DEBUG oslo_concurrency.lockutils [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:21 np0005554845 nova_compute[187128]: 2025-12-11 06:11:21.988 187132 DEBUG oslo_concurrency.lockutils [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:22 np0005554845 nova_compute[187128]: 2025-12-11 06:11:22.048 187132 DEBUG nova.compute.provider_tree [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:11:22 np0005554845 nova_compute[187128]: 2025-12-11 06:11:22.062 187132 DEBUG nova.scheduler.client.report [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:11:22 np0005554845 nova_compute[187128]: 2025-12-11 06:11:22.089 187132 DEBUG oslo_concurrency.lockutils [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.100s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:22 np0005554845 nova_compute[187128]: 2025-12-11 06:11:22.121 187132 INFO nova.scheduler.client.report [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Deleted allocations for instance 12be6919-2546-4cdc-9e86-d73c99aaad0c#033[00m
Dec 11 01:11:22 np0005554845 podman[219084]: 2025-12-11 06:11:22.135926259 +0000 UTC m=+0.058587157 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, managed_by=edpm_ansible)
Dec 11 01:11:22 np0005554845 nova_compute[187128]: 2025-12-11 06:11:22.185 187132 DEBUG oslo_concurrency.lockutils [None req-58a0a179-ce31-453b-8eb7-76e6ec31d5da 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "12be6919-2546-4cdc-9e86-d73c99aaad0c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.801s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:22 np0005554845 podman[219085]: 2025-12-11 06:11:22.18665564 +0000 UTC m=+0.102194964 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0)
Dec 11 01:11:22 np0005554845 nova_compute[187128]: 2025-12-11 06:11:22.471 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:23 np0005554845 nova_compute[187128]: 2025-12-11 06:11:23.532 187132 DEBUG nova.compute.manager [req-900fce28-4f49-41f1-aa72-16b262a3cebb req-bb137d11-1d94-46c3-ad00-252fa32da7d5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-deleted-0c0854bd-fd12-4869-8d7f-57d59abbb6ee external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:23 np0005554845 nova_compute[187128]: 2025-12-11 06:11:23.532 187132 DEBUG nova.compute.manager [req-900fce28-4f49-41f1-aa72-16b262a3cebb req-bb137d11-1d94-46c3-ad00-252fa32da7d5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Received event network-vif-deleted-7ef553a1-7d4f-4a57-a48b-c4b3385d9a2b external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.109 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433469.107983, 182af6cf-b56e-4c6a-aeb5-092944f1745a => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.109 187132 INFO nova.compute.manager [-] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:11:24 np0005554845 podman[219129]: 2025-12-11 06:11:24.143490069 +0000 UTC m=+0.082192140 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.149 187132 DEBUG nova.compute.manager [None req-ddf42521-f6e1-4086-b49d-32bcdb31c550 - - - - - -] [instance: 182af6cf-b56e-4c6a-aeb5-092944f1745a] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.695 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.715 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.716 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.716 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.716 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.870 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.871 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5682MB free_disk=73.33057022094727GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.872 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.872 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.916 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.916 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.937 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.952 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.973 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:11:24 np0005554845 nova_compute[187128]: 2025-12-11 06:11:24.973 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:25 np0005554845 nova_compute[187128]: 2025-12-11 06:11:25.244 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:25 np0005554845 nova_compute[187128]: 2025-12-11 06:11:25.969 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:11:25 np0005554845 nova_compute[187128]: 2025-12-11 06:11:25.970 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:11:25 np0005554845 nova_compute[187128]: 2025-12-11 06:11:25.970 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:11:25 np0005554845 nova_compute[187128]: 2025-12-11 06:11:25.971 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:11:25 np0005554845 nova_compute[187128]: 2025-12-11 06:11:25.986 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:11:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:26.223 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:26.224 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:26.225 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:27 np0005554845 nova_compute[187128]: 2025-12-11 06:11:27.567 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:27 np0005554845 nova_compute[187128]: 2025-12-11 06:11:27.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:11:27 np0005554845 nova_compute[187128]: 2025-12-11 06:11:27.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:11:27 np0005554845 nova_compute[187128]: 2025-12-11 06:11:27.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:11:27 np0005554845 nova_compute[187128]: 2025-12-11 06:11:27.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:11:28 np0005554845 podman[219150]: 2025-12-11 06:11:28.131397068 +0000 UTC m=+0.059779700 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:11:28 np0005554845 podman[219151]: 2025-12-11 06:11:28.13223108 +0000 UTC m=+0.059327707 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, version=9.6, io.buildah.version=1.33.7)
Dec 11 01:11:28 np0005554845 nova_compute[187128]: 2025-12-11 06:11:28.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:11:29 np0005554845 nova_compute[187128]: 2025-12-11 06:11:29.698 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.098 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.099 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.100 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:11:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:11:30 np0005554845 nova_compute[187128]: 2025-12-11 06:11:30.279 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:30 np0005554845 nova_compute[187128]: 2025-12-11 06:11:30.497 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:30 np0005554845 nova_compute[187128]: 2025-12-11 06:11:30.497 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:30 np0005554845 nova_compute[187128]: 2025-12-11 06:11:30.566 187132 DEBUG nova.compute.manager [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:11:30 np0005554845 nova_compute[187128]: 2025-12-11 06:11:30.719 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:30 np0005554845 nova_compute[187128]: 2025-12-11 06:11:30.720 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:30 np0005554845 nova_compute[187128]: 2025-12-11 06:11:30.726 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:11:30 np0005554845 nova_compute[187128]: 2025-12-11 06:11:30.727 187132 INFO nova.compute.claims [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.169 187132 DEBUG nova.compute.provider_tree [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.220 187132 DEBUG nova.scheduler.client.report [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.268 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.548s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.269 187132 DEBUG nova.compute.manager [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.340 187132 DEBUG nova.compute.manager [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.341 187132 DEBUG nova.network.neutron [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.371 187132 INFO nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.395 187132 DEBUG nova.compute.manager [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.587 187132 DEBUG nova.compute.manager [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.589 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.589 187132 INFO nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Creating image(s)#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.590 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.590 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.591 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.609 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.666 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.668 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.669 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.698 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.718 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.730 187132 DEBUG nova.policy [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.754 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.756 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.803 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk 1073741824" returned: 0 in 0.047s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.804 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.135s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.805 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.867 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.868 187132 DEBUG nova.virt.disk.api [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Checking if we can resize image /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.869 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.944 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.945 187132 DEBUG nova.virt.disk.api [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Cannot resize image /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.946 187132 DEBUG nova.objects.instance [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'migration_context' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.962 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.963 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Ensure instance console log exists: /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.964 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.965 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:31 np0005554845 nova_compute[187128]: 2025-12-11 06:11:31.965 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:32 np0005554845 nova_compute[187128]: 2025-12-11 06:11:32.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:11:33 np0005554845 nova_compute[187128]: 2025-12-11 06:11:33.455 187132 DEBUG nova.network.neutron [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Successfully created port: 1f7ef811-f5fc-4537-879a-7227b3f08154 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:11:34 np0005554845 nova_compute[187128]: 2025-12-11 06:11:34.663 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433479.6618989, 12be6919-2546-4cdc-9e86-d73c99aaad0c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:11:34 np0005554845 nova_compute[187128]: 2025-12-11 06:11:34.665 187132 INFO nova.compute.manager [-] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:11:34 np0005554845 nova_compute[187128]: 2025-12-11 06:11:34.685 187132 DEBUG nova.compute.manager [None req-4160596b-17f2-41e5-9960-22170c283bba - - - - - -] [instance: 12be6919-2546-4cdc-9e86-d73c99aaad0c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:11:34 np0005554845 nova_compute[187128]: 2025-12-11 06:11:34.700 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:35 np0005554845 nova_compute[187128]: 2025-12-11 06:11:35.029 187132 DEBUG nova.network.neutron [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Successfully updated port: 1f7ef811-f5fc-4537-879a-7227b3f08154 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:11:35 np0005554845 nova_compute[187128]: 2025-12-11 06:11:35.052 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:11:35 np0005554845 nova_compute[187128]: 2025-12-11 06:11:35.052 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquired lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:11:35 np0005554845 nova_compute[187128]: 2025-12-11 06:11:35.053 187132 DEBUG nova.network.neutron [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:11:35 np0005554845 nova_compute[187128]: 2025-12-11 06:11:35.224 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:35 np0005554845 nova_compute[187128]: 2025-12-11 06:11:35.332 187132 DEBUG nova.network.neutron [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:11:35 np0005554845 nova_compute[187128]: 2025-12-11 06:11:35.372 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:35 np0005554845 nova_compute[187128]: 2025-12-11 06:11:35.383 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:35 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:35.468 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=10, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=9) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:11:35 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:35.468 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:11:35 np0005554845 nova_compute[187128]: 2025-12-11 06:11:35.470 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.363 187132 DEBUG nova.network.neutron [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Updating instance_info_cache with network_info: [{"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.394 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Releasing lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.395 187132 DEBUG nova.compute.manager [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Instance network_info: |[{"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.398 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Start _get_guest_xml network_info=[{"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.404 187132 WARNING nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.409 187132 DEBUG nova.virt.libvirt.host [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.410 187132 DEBUG nova.virt.libvirt.host [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.413 187132 DEBUG nova.virt.libvirt.host [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.414 187132 DEBUG nova.virt.libvirt.host [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.416 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.416 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.417 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.417 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.418 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.418 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.418 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.419 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.419 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.419 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.420 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.420 187132 DEBUG nova.virt.hardware [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.425 187132 DEBUG nova.virt.libvirt.vif [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1449027617',display_name='tempest-TestNetworkAdvancedServerOps-server-1449027617',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1449027617',id=26,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIYjzi0ZhE9Kf1ZEwkmM5x6WL85TnTMwYO++Dc14hBCsOp2W8k05a6CYUDsbMTvUgGW/za/aDw6E0zAJuZyOR7KgXVKhJbAmj8ilH2QdC3Taix32uGnPmVik3hsotRzG1A==',key_name='tempest-TestNetworkAdvancedServerOps-1716433215',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-s3zw0n3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:11:31Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=e5327266-05a9-47da-91fc-d5cd8866fa3e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.426 187132 DEBUG nova.network.os_vif_util [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.427 187132 DEBUG nova.network.os_vif_util [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.430 187132 DEBUG nova.objects.instance [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'pci_devices' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.441 187132 DEBUG nova.compute.manager [req-05d1c546-69bf-451a-9437-b7d291bf0b47 req-aced878e-3d49-424a-8d9b-fbe811b36f0d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-changed-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.442 187132 DEBUG nova.compute.manager [req-05d1c546-69bf-451a-9437-b7d291bf0b47 req-aced878e-3d49-424a-8d9b-fbe811b36f0d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Refreshing instance network info cache due to event network-changed-1f7ef811-f5fc-4537-879a-7227b3f08154. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.442 187132 DEBUG oslo_concurrency.lockutils [req-05d1c546-69bf-451a-9437-b7d291bf0b47 req-aced878e-3d49-424a-8d9b-fbe811b36f0d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.443 187132 DEBUG oslo_concurrency.lockutils [req-05d1c546-69bf-451a-9437-b7d291bf0b47 req-aced878e-3d49-424a-8d9b-fbe811b36f0d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.443 187132 DEBUG nova.network.neutron [req-05d1c546-69bf-451a-9437-b7d291bf0b47 req-aced878e-3d49-424a-8d9b-fbe811b36f0d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Refreshing network info cache for port 1f7ef811-f5fc-4537-879a-7227b3f08154 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.447 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  <uuid>e5327266-05a9-47da-91fc-d5cd8866fa3e</uuid>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  <name>instance-0000001a</name>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1449027617</nova:name>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:11:36</nova:creationTime>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:        <nova:user uuid="40cb523bfe1e4484bb2e91c903500c97">tempest-TestNetworkAdvancedServerOps-369129245-project-member</nova:user>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:        <nova:project uuid="3ec4c03cd7274517b88d9087ad4cbd83">tempest-TestNetworkAdvancedServerOps-369129245</nova:project>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:        <nova:port uuid="1f7ef811-f5fc-4537-879a-7227b3f08154">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <entry name="serial">e5327266-05a9-47da-91fc-d5cd8866fa3e</entry>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <entry name="uuid">e5327266-05a9-47da-91fc-d5cd8866fa3e</entry>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.config"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:f2:99:40"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <target dev="tap1f7ef811-f5"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/console.log" append="off"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:11:36 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:11:36 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:11:36 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:11:36 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.448 187132 DEBUG nova.compute.manager [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Preparing to wait for external event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.448 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.449 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.449 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.450 187132 DEBUG nova.virt.libvirt.vif [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1449027617',display_name='tempest-TestNetworkAdvancedServerOps-server-1449027617',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1449027617',id=26,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIYjzi0ZhE9Kf1ZEwkmM5x6WL85TnTMwYO++Dc14hBCsOp2W8k05a6CYUDsbMTvUgGW/za/aDw6E0zAJuZyOR7KgXVKhJbAmj8ilH2QdC3Taix32uGnPmVik3hsotRzG1A==',key_name='tempest-TestNetworkAdvancedServerOps-1716433215',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-s3zw0n3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:11:31Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=e5327266-05a9-47da-91fc-d5cd8866fa3e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.451 187132 DEBUG nova.network.os_vif_util [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.451 187132 DEBUG nova.network.os_vif_util [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.452 187132 DEBUG os_vif [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.452 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.453 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.453 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.457 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.458 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f7ef811-f5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.459 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1f7ef811-f5, col_values=(('external_ids', {'iface-id': '1f7ef811-f5fc-4537-879a-7227b3f08154', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:99:40', 'vm-uuid': 'e5327266-05a9-47da-91fc-d5cd8866fa3e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.460 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:36 np0005554845 NetworkManager[55529]: <info>  [1765433496.4620] manager: (tap1f7ef811-f5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/90)
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.462 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.466 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.467 187132 INFO os_vif [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5')#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.515 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.516 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.516 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No VIF found with MAC fa:16:3e:f2:99:40, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:11:36 np0005554845 nova_compute[187128]: 2025-12-11 06:11:36.516 187132 INFO nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Using config drive#033[00m
Dec 11 01:11:38 np0005554845 nova_compute[187128]: 2025-12-11 06:11:38.425 187132 INFO nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Creating config drive at /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.config#033[00m
Dec 11 01:11:38 np0005554845 nova_compute[187128]: 2025-12-11 06:11:38.431 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmfpvr9wq execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:11:38 np0005554845 nova_compute[187128]: 2025-12-11 06:11:38.561 187132 DEBUG oslo_concurrency.processutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmfpvr9wq" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:11:38 np0005554845 kernel: tap1f7ef811-f5: entered promiscuous mode
Dec 11 01:11:38 np0005554845 NetworkManager[55529]: <info>  [1765433498.6376] manager: (tap1f7ef811-f5): new Tun device (/org/freedesktop/NetworkManager/Devices/91)
Dec 11 01:11:38 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:38Z|00186|binding|INFO|Claiming lport 1f7ef811-f5fc-4537-879a-7227b3f08154 for this chassis.
Dec 11 01:11:38 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:38Z|00187|binding|INFO|1f7ef811-f5fc-4537-879a-7227b3f08154: Claiming fa:16:3e:f2:99:40 10.100.0.6
Dec 11 01:11:38 np0005554845 nova_compute[187128]: 2025-12-11 06:11:38.639 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:38 np0005554845 nova_compute[187128]: 2025-12-11 06:11:38.643 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:38 np0005554845 systemd-udevd[219229]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:11:38 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:38Z|00188|binding|INFO|Setting lport 1f7ef811-f5fc-4537-879a-7227b3f08154 ovn-installed in OVS
Dec 11 01:11:38 np0005554845 NetworkManager[55529]: <info>  [1765433498.7433] device (tap1f7ef811-f5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:11:38 np0005554845 NetworkManager[55529]: <info>  [1765433498.7443] device (tap1f7ef811-f5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:11:38 np0005554845 nova_compute[187128]: 2025-12-11 06:11:38.744 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:38 np0005554845 systemd-machined[153381]: New machine qemu-13-instance-0000001a.
Dec 11 01:11:38 np0005554845 systemd[1]: Started Virtual Machine qemu-13-instance-0000001a.
Dec 11 01:11:38 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:38Z|00189|binding|INFO|Setting lport 1f7ef811-f5fc-4537-879a-7227b3f08154 up in Southbound
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.845 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:99:40 10.100.0.6'], port_security=['fa:16:3e:f2:99:40 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e5327266-05a9-47da-91fc-d5cd8866fa3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1399078c-4ff0-437d-a25b-2a77a741362d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'dc718b16-d267-4010-9f1f-c5510585bdeb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d50d5fec-d774-4d77-a4b2-f05764d3d543, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=1f7ef811-f5fc-4537-879a-7227b3f08154) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.849 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 1f7ef811-f5fc-4537-879a-7227b3f08154 in datapath 1399078c-4ff0-437d-a25b-2a77a741362d bound to our chassis#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.852 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1399078c-4ff0-437d-a25b-2a77a741362d#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.872 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6c67a15b-7210-4795-8cf4-7cc77e61b7d5]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.873 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1399078c-41 in ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.877 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1399078c-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.877 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[636995a4-ed25-43a2-87b3-17028501cc0f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.879 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[bc5d3c71-5261-47af-8c05-892f855a0867]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.897 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[c5a86150-2291-4948-af10-fc5380bacf81]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.913 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[eb86fa8e-a791-4cf4-91fe-a105c96bd2e1]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.944 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[c2067d4c-8d3f-4069-8a40-690ece4780e7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.950 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4ad237db-f940-4e0b-89b0-9f707e7e0910]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:38 np0005554845 NetworkManager[55529]: <info>  [1765433498.9515] manager: (tap1399078c-40): new Veth device (/org/freedesktop/NetworkManager/Devices/92)
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.980 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[ff8669dc-d049-45ac-b593-b873e787b7f3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:38.983 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[ba08fd1c-1c5b-4636-9a66-ff103fd17fcf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:39 np0005554845 NetworkManager[55529]: <info>  [1765433499.0136] device (tap1399078c-40): carrier: link connected
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.022 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[55027726-be9d-43e5-8f0b-09de4337e236]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.045 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5f1604c3-d2a1-4387-a420-a318cbda43e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1399078c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:71:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377365, 'reachable_time': 18084, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219269, 'error': None, 'target': 'ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.066 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fbf320c6-ba37-4abb-8f56-aa5997986ca7]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed5:7199'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 377365, 'tstamp': 377365}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219270, 'error': None, 'target': 'ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.091 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d273aca4-4fc0-4708-b758-bd352371098b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1399078c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:71:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 55], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377365, 'reachable_time': 18084, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219272, 'error': None, 'target': 'ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:39 np0005554845 nova_compute[187128]: 2025-12-11 06:11:39.103 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433499.1029205, e5327266-05a9-47da-91fc-d5cd8866fa3e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:11:39 np0005554845 nova_compute[187128]: 2025-12-11 06:11:39.104 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] VM Started (Lifecycle Event)#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.135 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[be9ff871-44ff-401e-b35e-6214cc9d0580]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.224 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1688ef26-0813-4e6e-81fc-2362c5b6a048]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.226 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1399078c-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.226 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.227 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1399078c-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:39 np0005554845 NetworkManager[55529]: <info>  [1765433499.2327] manager: (tap1399078c-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/93)
Dec 11 01:11:39 np0005554845 kernel: tap1399078c-40: entered promiscuous mode
Dec 11 01:11:39 np0005554845 nova_compute[187128]: 2025-12-11 06:11:39.231 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:39 np0005554845 nova_compute[187128]: 2025-12-11 06:11:39.235 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.235 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1399078c-40, col_values=(('external_ids', {'iface-id': '7bb85b09-e8e0-47cb-9628-df3e8460ffff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:39 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:39Z|00190|binding|INFO|Releasing lport 7bb85b09-e8e0-47cb-9628-df3e8460ffff from this chassis (sb_readonly=0)
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.239 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1399078c-4ff0-437d-a25b-2a77a741362d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1399078c-4ff0-437d-a25b-2a77a741362d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.240 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f0a4d071-c85d-43fd-9d5e-00aa779c66a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.241 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-1399078c-4ff0-437d-a25b-2a77a741362d
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/1399078c-4ff0-437d-a25b-2a77a741362d.pid.haproxy
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 1399078c-4ff0-437d-a25b-2a77a741362d
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:11:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:39.242 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d', 'env', 'PROCESS_TAG=haproxy-1399078c-4ff0-437d-a25b-2a77a741362d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1399078c-4ff0-437d-a25b-2a77a741362d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:11:39 np0005554845 nova_compute[187128]: 2025-12-11 06:11:39.251 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:39 np0005554845 nova_compute[187128]: 2025-12-11 06:11:39.319 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:11:39 np0005554845 nova_compute[187128]: 2025-12-11 06:11:39.325 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433499.1042917, e5327266-05a9-47da-91fc-d5cd8866fa3e => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:11:39 np0005554845 nova_compute[187128]: 2025-12-11 06:11:39.326 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:11:39 np0005554845 podman[219304]: 2025-12-11 06:11:39.630736526 +0000 UTC m=+0.062941522 container create de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 11 01:11:39 np0005554845 systemd[1]: Started libpod-conmon-de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b.scope.
Dec 11 01:11:39 np0005554845 nova_compute[187128]: 2025-12-11 06:11:39.680 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:11:39 np0005554845 podman[219304]: 2025-12-11 06:11:39.5901431 +0000 UTC m=+0.022348106 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:11:39 np0005554845 nova_compute[187128]: 2025-12-11 06:11:39.685 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:11:39 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:11:39 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/e6ff32ee49076ab179419562afcad2976343e8b09ea42cb76e7b2f96f18bddb4/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:11:39 np0005554845 nova_compute[187128]: 2025-12-11 06:11:39.717 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:11:39 np0005554845 podman[219304]: 2025-12-11 06:11:39.724686059 +0000 UTC m=+0.156891055 container init de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 11 01:11:39 np0005554845 podman[219304]: 2025-12-11 06:11:39.731477786 +0000 UTC m=+0.163682772 container start de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3)
Dec 11 01:11:39 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219320]: [NOTICE]   (219324) : New worker (219326) forked
Dec 11 01:11:39 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219320]: [NOTICE]   (219324) : Loading success.
Dec 11 01:11:40 np0005554845 nova_compute[187128]: 2025-12-11 06:11:40.376 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.462 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:41 np0005554845 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.816 187132 DEBUG nova.compute.manager [req-3af6da3c-53e9-4017-a8fe-bf9e5b1462bd req-fda22a44-b972-43bc-89aa-f474f0bea9fc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.816 187132 DEBUG oslo_concurrency.lockutils [req-3af6da3c-53e9-4017-a8fe-bf9e5b1462bd req-fda22a44-b972-43bc-89aa-f474f0bea9fc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.817 187132 DEBUG oslo_concurrency.lockutils [req-3af6da3c-53e9-4017-a8fe-bf9e5b1462bd req-fda22a44-b972-43bc-89aa-f474f0bea9fc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.817 187132 DEBUG oslo_concurrency.lockutils [req-3af6da3c-53e9-4017-a8fe-bf9e5b1462bd req-fda22a44-b972-43bc-89aa-f474f0bea9fc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.817 187132 DEBUG nova.compute.manager [req-3af6da3c-53e9-4017-a8fe-bf9e5b1462bd req-fda22a44-b972-43bc-89aa-f474f0bea9fc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Processing event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.818 187132 DEBUG nova.compute.manager [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Instance event wait completed in 2 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.823 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433501.8221831, e5327266-05a9-47da-91fc-d5cd8866fa3e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.823 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.825 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.828 187132 INFO nova.virt.libvirt.driver [-] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Instance spawned successfully.#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.828 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.863 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.868 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.872 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.873 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.873 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.873 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.874 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.874 187132 DEBUG nova.virt.libvirt.driver [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.937 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.996 187132 INFO nova.compute.manager [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Took 10.41 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:11:41 np0005554845 nova_compute[187128]: 2025-12-11 06:11:41.996 187132 DEBUG nova.compute.manager [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:11:42 np0005554845 nova_compute[187128]: 2025-12-11 06:11:42.081 187132 INFO nova.compute.manager [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Took 11.45 seconds to build instance.#033[00m
Dec 11 01:11:42 np0005554845 nova_compute[187128]: 2025-12-11 06:11:42.109 187132 DEBUG oslo_concurrency.lockutils [None req-457587cb-69bf-47f5-975b-b8e575d927ac 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.611s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:42 np0005554845 nova_compute[187128]: 2025-12-11 06:11:42.149 187132 DEBUG nova.network.neutron [req-05d1c546-69bf-451a-9437-b7d291bf0b47 req-aced878e-3d49-424a-8d9b-fbe811b36f0d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Updated VIF entry in instance network info cache for port 1f7ef811-f5fc-4537-879a-7227b3f08154. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:11:42 np0005554845 nova_compute[187128]: 2025-12-11 06:11:42.150 187132 DEBUG nova.network.neutron [req-05d1c546-69bf-451a-9437-b7d291bf0b47 req-aced878e-3d49-424a-8d9b-fbe811b36f0d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Updating instance_info_cache with network_info: [{"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:11:42 np0005554845 nova_compute[187128]: 2025-12-11 06:11:42.180 187132 DEBUG oslo_concurrency.lockutils [req-05d1c546-69bf-451a-9437-b7d291bf0b47 req-aced878e-3d49-424a-8d9b-fbe811b36f0d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:11:44 np0005554845 podman[219336]: 2025-12-11 06:11:44.143505784 +0000 UTC m=+0.062236553 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:11:44 np0005554845 nova_compute[187128]: 2025-12-11 06:11:44.178 187132 DEBUG nova.compute.manager [req-d1c55a83-68bb-4612-99e3-42025fe9b929 req-dfaf9a71-b307-41fc-a267-43e987ffe60c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:44 np0005554845 nova_compute[187128]: 2025-12-11 06:11:44.178 187132 DEBUG oslo_concurrency.lockutils [req-d1c55a83-68bb-4612-99e3-42025fe9b929 req-dfaf9a71-b307-41fc-a267-43e987ffe60c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:11:44 np0005554845 nova_compute[187128]: 2025-12-11 06:11:44.178 187132 DEBUG oslo_concurrency.lockutils [req-d1c55a83-68bb-4612-99e3-42025fe9b929 req-dfaf9a71-b307-41fc-a267-43e987ffe60c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:11:44 np0005554845 nova_compute[187128]: 2025-12-11 06:11:44.179 187132 DEBUG oslo_concurrency.lockutils [req-d1c55a83-68bb-4612-99e3-42025fe9b929 req-dfaf9a71-b307-41fc-a267-43e987ffe60c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:11:44 np0005554845 nova_compute[187128]: 2025-12-11 06:11:44.179 187132 DEBUG nova.compute.manager [req-d1c55a83-68bb-4612-99e3-42025fe9b929 req-dfaf9a71-b307-41fc-a267-43e987ffe60c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] No waiting events found dispatching network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:11:44 np0005554845 nova_compute[187128]: 2025-12-11 06:11:44.179 187132 WARNING nova.compute.manager [req-d1c55a83-68bb-4612-99e3-42025fe9b929 req-dfaf9a71-b307-41fc-a267-43e987ffe60c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received unexpected event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 for instance with vm_state active and task_state None.#033[00m
Dec 11 01:11:45 np0005554845 nova_compute[187128]: 2025-12-11 06:11:45.380 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:45 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:11:45.471 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '10'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:11:46 np0005554845 nova_compute[187128]: 2025-12-11 06:11:46.464 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:47 np0005554845 NetworkManager[55529]: <info>  [1765433507.3409] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/94)
Dec 11 01:11:47 np0005554845 NetworkManager[55529]: <info>  [1765433507.3415] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/95)
Dec 11 01:11:47 np0005554845 nova_compute[187128]: 2025-12-11 06:11:47.342 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:47 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:47Z|00191|binding|INFO|Releasing lport 7bb85b09-e8e0-47cb-9628-df3e8460ffff from this chassis (sb_readonly=0)
Dec 11 01:11:47 np0005554845 nova_compute[187128]: 2025-12-11 06:11:47.427 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:47 np0005554845 nova_compute[187128]: 2025-12-11 06:11:47.435 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:48 np0005554845 nova_compute[187128]: 2025-12-11 06:11:48.470 187132 DEBUG nova.compute.manager [req-6d7096b0-1838-42a0-b0e9-fdaffd2c63c0 req-d63b2a08-79a6-4590-97f7-76b2c73d7ca1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-changed-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:11:48 np0005554845 nova_compute[187128]: 2025-12-11 06:11:48.471 187132 DEBUG nova.compute.manager [req-6d7096b0-1838-42a0-b0e9-fdaffd2c63c0 req-d63b2a08-79a6-4590-97f7-76b2c73d7ca1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Refreshing instance network info cache due to event network-changed-1f7ef811-f5fc-4537-879a-7227b3f08154. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:11:48 np0005554845 nova_compute[187128]: 2025-12-11 06:11:48.472 187132 DEBUG oslo_concurrency.lockutils [req-6d7096b0-1838-42a0-b0e9-fdaffd2c63c0 req-d63b2a08-79a6-4590-97f7-76b2c73d7ca1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:11:48 np0005554845 nova_compute[187128]: 2025-12-11 06:11:48.472 187132 DEBUG oslo_concurrency.lockutils [req-6d7096b0-1838-42a0-b0e9-fdaffd2c63c0 req-d63b2a08-79a6-4590-97f7-76b2c73d7ca1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:11:48 np0005554845 nova_compute[187128]: 2025-12-11 06:11:48.473 187132 DEBUG nova.network.neutron [req-6d7096b0-1838-42a0-b0e9-fdaffd2c63c0 req-d63b2a08-79a6-4590-97f7-76b2c73d7ca1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Refreshing network info cache for port 1f7ef811-f5fc-4537-879a-7227b3f08154 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:11:50 np0005554845 podman[219361]: 2025-12-11 06:11:50.144428644 +0000 UTC m=+0.077397719 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:11:50 np0005554845 nova_compute[187128]: 2025-12-11 06:11:50.381 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:51 np0005554845 nova_compute[187128]: 2025-12-11 06:11:51.467 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:51 np0005554845 nova_compute[187128]: 2025-12-11 06:11:51.798 187132 DEBUG nova.network.neutron [req-6d7096b0-1838-42a0-b0e9-fdaffd2c63c0 req-d63b2a08-79a6-4590-97f7-76b2c73d7ca1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Updated VIF entry in instance network info cache for port 1f7ef811-f5fc-4537-879a-7227b3f08154. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:11:51 np0005554845 nova_compute[187128]: 2025-12-11 06:11:51.798 187132 DEBUG nova.network.neutron [req-6d7096b0-1838-42a0-b0e9-fdaffd2c63c0 req-d63b2a08-79a6-4590-97f7-76b2c73d7ca1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Updating instance_info_cache with network_info: [{"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:11:51 np0005554845 nova_compute[187128]: 2025-12-11 06:11:51.844 187132 DEBUG oslo_concurrency.lockutils [req-6d7096b0-1838-42a0-b0e9-fdaffd2c63c0 req-d63b2a08-79a6-4590-97f7-76b2c73d7ca1 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:11:53 np0005554845 podman[219402]: 2025-12-11 06:11:53.149363988 +0000 UTC m=+0.077449161 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 11 01:11:53 np0005554845 podman[219403]: 2025-12-11 06:11:53.180912826 +0000 UTC m=+0.106128030 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=ovn_controller)
Dec 11 01:11:53 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:53Z|00024|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f2:99:40 10.100.0.6
Dec 11 01:11:53 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:53Z|00025|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:99:40 10.100.0.6
Dec 11 01:11:54 np0005554845 ovn_controller[95428]: 2025-12-11T06:11:54Z|00192|binding|INFO|Releasing lport 7bb85b09-e8e0-47cb-9628-df3e8460ffff from this chassis (sb_readonly=0)
Dec 11 01:11:54 np0005554845 nova_compute[187128]: 2025-12-11 06:11:54.626 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:55 np0005554845 podman[219446]: 2025-12-11 06:11:55.151977448 +0000 UTC m=+0.069850912 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible)
Dec 11 01:11:55 np0005554845 nova_compute[187128]: 2025-12-11 06:11:55.381 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:56 np0005554845 nova_compute[187128]: 2025-12-11 06:11:56.510 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:57 np0005554845 nova_compute[187128]: 2025-12-11 06:11:57.144 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:57 np0005554845 nova_compute[187128]: 2025-12-11 06:11:57.927 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:11:59 np0005554845 podman[219467]: 2025-12-11 06:11:59.130236086 +0000 UTC m=+0.057532953 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., vcs-type=git, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, build-date=2025-08-20T13:12:41, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container)
Dec 11 01:11:59 np0005554845 podman[219466]: 2025-12-11 06:11:59.142192404 +0000 UTC m=+0.065520093 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:12:00 np0005554845 nova_compute[187128]: 2025-12-11 06:12:00.281 187132 INFO nova.compute.manager [None req-be1cfc17-b2e0-45b6-97f2-4d24afeebdc3 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Get console output#033[00m
Dec 11 01:12:00 np0005554845 nova_compute[187128]: 2025-12-11 06:12:00.290 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:12:00 np0005554845 nova_compute[187128]: 2025-12-11 06:12:00.384 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:01 np0005554845 nova_compute[187128]: 2025-12-11 06:12:01.725 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:01 np0005554845 nova_compute[187128]: 2025-12-11 06:12:01.761 187132 INFO nova.compute.manager [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Rebuilding instance#033[00m
Dec 11 01:12:02 np0005554845 nova_compute[187128]: 2025-12-11 06:12:02.304 187132 DEBUG nova.objects.instance [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'trusted_certs' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:12:02 np0005554845 nova_compute[187128]: 2025-12-11 06:12:02.321 187132 DEBUG nova.compute.manager [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:12:02 np0005554845 nova_compute[187128]: 2025-12-11 06:12:02.372 187132 DEBUG nova.objects.instance [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'pci_requests' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:12:02 np0005554845 nova_compute[187128]: 2025-12-11 06:12:02.384 187132 DEBUG nova.objects.instance [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'pci_devices' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:12:02 np0005554845 nova_compute[187128]: 2025-12-11 06:12:02.397 187132 DEBUG nova.objects.instance [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'resources' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:12:02 np0005554845 nova_compute[187128]: 2025-12-11 06:12:02.410 187132 DEBUG nova.objects.instance [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'migration_context' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:12:02 np0005554845 nova_compute[187128]: 2025-12-11 06:12:02.426 187132 DEBUG nova.objects.instance [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec 11 01:12:02 np0005554845 nova_compute[187128]: 2025-12-11 06:12:02.431 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Shutting down instance from state 1 _clean_shutdown /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4071#033[00m
Dec 11 01:12:04 np0005554845 kernel: tap1f7ef811-f5 (unregistering): left promiscuous mode
Dec 11 01:12:04 np0005554845 NetworkManager[55529]: <info>  [1765433524.6138] device (tap1f7ef811-f5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:12:04 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:04Z|00193|binding|INFO|Releasing lport 1f7ef811-f5fc-4537-879a-7227b3f08154 from this chassis (sb_readonly=0)
Dec 11 01:12:04 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:04Z|00194|binding|INFO|Setting lport 1f7ef811-f5fc-4537-879a-7227b3f08154 down in Southbound
Dec 11 01:12:04 np0005554845 nova_compute[187128]: 2025-12-11 06:12:04.617 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:04 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:04Z|00195|binding|INFO|Removing iface tap1f7ef811-f5 ovn-installed in OVS
Dec 11 01:12:04 np0005554845 nova_compute[187128]: 2025-12-11 06:12:04.621 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:04.626 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:99:40 10.100.0.6'], port_security=['fa:16:3e:f2:99:40 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e5327266-05a9-47da-91fc-d5cd8866fa3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1399078c-4ff0-437d-a25b-2a77a741362d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dc718b16-d267-4010-9f1f-c5510585bdeb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.216'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d50d5fec-d774-4d77-a4b2-f05764d3d543, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=1f7ef811-f5fc-4537-879a-7227b3f08154) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:12:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:04.627 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 1f7ef811-f5fc-4537-879a-7227b3f08154 in datapath 1399078c-4ff0-437d-a25b-2a77a741362d unbound from our chassis#033[00m
Dec 11 01:12:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:04.629 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1399078c-4ff0-437d-a25b-2a77a741362d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:12:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:04.631 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[e2dd3e7e-6cc1-4538-9111-6d87ec1aee1b]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:04.631 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d namespace which is not needed anymore#033[00m
Dec 11 01:12:04 np0005554845 nova_compute[187128]: 2025-12-11 06:12:04.641 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:04 np0005554845 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Dec 11 01:12:04 np0005554845 systemd[1]: machine-qemu\x2d13\x2dinstance\x2d0000001a.scope: Consumed 13.093s CPU time.
Dec 11 01:12:04 np0005554845 systemd-machined[153381]: Machine qemu-13-instance-0000001a terminated.
Dec 11 01:12:04 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219320]: [NOTICE]   (219324) : haproxy version is 2.8.14-c23fe91
Dec 11 01:12:04 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219320]: [NOTICE]   (219324) : path to executable is /usr/sbin/haproxy
Dec 11 01:12:04 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219320]: [WARNING]  (219324) : Exiting Master process...
Dec 11 01:12:04 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219320]: [ALERT]    (219324) : Current worker (219326) exited with code 143 (Terminated)
Dec 11 01:12:04 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219320]: [WARNING]  (219324) : All workers exited. Exiting... (0)
Dec 11 01:12:04 np0005554845 systemd[1]: libpod-de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b.scope: Deactivated successfully.
Dec 11 01:12:04 np0005554845 podman[219536]: 2025-12-11 06:12:04.803282449 +0000 UTC m=+0.057253155 container died de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 01:12:04 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b-userdata-shm.mount: Deactivated successfully.
Dec 11 01:12:04 np0005554845 systemd[1]: var-lib-containers-storage-overlay-e6ff32ee49076ab179419562afcad2976343e8b09ea42cb76e7b2f96f18bddb4-merged.mount: Deactivated successfully.
Dec 11 01:12:04 np0005554845 podman[219536]: 2025-12-11 06:12:04.840784051 +0000 UTC m=+0.094754727 container cleanup de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2)
Dec 11 01:12:04 np0005554845 systemd[1]: libpod-conmon-de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b.scope: Deactivated successfully.
Dec 11 01:12:04 np0005554845 podman[219570]: 2025-12-11 06:12:04.92691358 +0000 UTC m=+0.050672046 container remove de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 11 01:12:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:04.934 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5d530b38-21fd-42ad-aa00-23005c72a4df]: (4, ('Thu Dec 11 06:12:04 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d (de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b)\nde60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b\nThu Dec 11 06:12:04 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d (de60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b)\nde60c52a7395be5a9cfa928039b547a2e0077aeadb69881d8df2efdceb9a424b\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:04.936 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fcd22e5f-c47c-407d-87ae-68a5aa8300e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:04.937 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1399078c-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:12:04 np0005554845 nova_compute[187128]: 2025-12-11 06:12:04.939 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:04 np0005554845 kernel: tap1399078c-40: left promiscuous mode
Dec 11 01:12:04 np0005554845 nova_compute[187128]: 2025-12-11 06:12:04.961 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:04.964 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[68e94f10-42e0-4788-9b8a-33c94bc842ac]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:04.979 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d911a2b8-44e6-427d-8cdf-ddcda938c8f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:04 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:04.981 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1b2f8f75-85f0-437a-a90d-2ecc3759c1a7]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:05.004 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f790c1dd-572a-42c1-ab04-10914845a6ea]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 377358, 'reachable_time': 29989, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219599, 'error': None, 'target': 'ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:05 np0005554845 systemd[1]: run-netns-ovnmeta\x2d1399078c\x2d4ff0\x2d437d\x2da25b\x2d2a77a741362d.mount: Deactivated successfully.
Dec 11 01:12:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:05.007 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:12:05 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:05.008 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[82334f0e-2a60-4a57-b246-5536ee9e3300]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.387 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.452 187132 INFO nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Instance shutdown successfully after 3 seconds.#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.457 187132 INFO nova.virt.libvirt.driver [-] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Instance destroyed successfully.#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.463 187132 INFO nova.virt.libvirt.driver [-] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Instance destroyed successfully.#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.463 187132 DEBUG nova.virt.libvirt.vif [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1449027617',display_name='tempest-TestNetworkAdvancedServerOps-server-1449027617',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1449027617',id=26,image_ref='6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIYjzi0ZhE9Kf1ZEwkmM5x6WL85TnTMwYO++Dc14hBCsOp2W8k05a6CYUDsbMTvUgGW/za/aDw6E0zAJuZyOR7KgXVKhJbAmj8ilH2QdC3Taix32uGnPmVik3hsotRzG1A==',key_name='tempest-TestNetworkAdvancedServerOps-1716433215',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:11:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-s3zw0n3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b',image_container_format='bare',image_disk_format='qcow2',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='rebuilding',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:12:00Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=e5327266-05a9-47da-91fc-d5cd8866fa3e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, 
"meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.464 187132 DEBUG nova.network.os_vif_util [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.465 187132 DEBUG nova.network.os_vif_util [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.465 187132 DEBUG os_vif [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.467 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.467 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f7ef811-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.469 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.470 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.472 187132 INFO os_vif [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5')#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.473 187132 INFO nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Deleting instance files /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e_del#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.475 187132 INFO nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Deletion of /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e_del complete#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.612 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.612 187132 INFO nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Creating image(s)#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.613 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.613 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.614 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.614 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "b9e1575dda05624122bbe83e655f0ad40498d1c7" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:05 np0005554845 nova_compute[187128]: 2025-12-11 06:12:05.614 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "b9e1575dda05624122bbe83e655f0ad40498d1c7" acquired by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:06 np0005554845 nova_compute[187128]: 2025-12-11 06:12:06.722 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7.part --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:06 np0005554845 nova_compute[187128]: 2025-12-11 06:12:06.788 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7.part --force-share --output=json" returned: 0 in 0.066s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:06 np0005554845 nova_compute[187128]: 2025-12-11 06:12:06.790 187132 DEBUG nova.virt.images [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] 6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b was qcow2, converting to raw fetch_to_raw /usr/lib/python3.9/site-packages/nova/virt/images.py:242#033[00m
Dec 11 01:12:06 np0005554845 nova_compute[187128]: 2025-12-11 06:12:06.791 187132 DEBUG nova.privsep.utils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Path '/var/lib/nova/instances' supports direct I/O supports_direct_io /usr/lib/python3.9/site-packages/nova/privsep/utils.py:63#033[00m
Dec 11 01:12:06 np0005554845 nova_compute[187128]: 2025-12-11 06:12:06.792 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7.part /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7.converted execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:06 np0005554845 nova_compute[187128]: 2025-12-11 06:12:06.992 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "qemu-img convert -t none -O raw -f qcow2 /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7.part /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7.converted" returned: 0 in 0.200s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:06 np0005554845 nova_compute[187128]: 2025-12-11 06:12:06.996 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7.converted --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.055 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7.converted --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.056 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "b9e1575dda05624122bbe83e655f0ad40498d1c7" "released" by "nova.virt.libvirt.imagebackend.Image.cache.<locals>.fetch_func_sync" :: held 1.442s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.070 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.165 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7 --force-share --output=json" returned: 0 in 0.095s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.167 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "b9e1575dda05624122bbe83e655f0ad40498d1c7" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.167 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "b9e1575dda05624122bbe83e655f0ad40498d1c7" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.186 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.261 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.263 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7,backing_fmt=raw /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.294 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7,backing_fmt=raw /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk 1073741824" returned: 0 in 0.031s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.295 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "b9e1575dda05624122bbe83e655f0ad40498d1c7" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.128s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.296 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.353 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/b9e1575dda05624122bbe83e655f0ad40498d1c7 --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.354 187132 DEBUG nova.virt.disk.api [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Checking if we can resize image /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.355 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.409 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.410 187132 DEBUG nova.virt.disk.api [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Cannot resize image /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.410 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.411 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Ensure instance console log exists: /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.411 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.412 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.412 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.414 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Start _get_guest_xml network_info=[{"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:55Z,direct_url=<?>,disk_format='qcow2',id=6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:56Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.418 187132 WARNING nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.: NotImplementedError#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.427 187132 DEBUG nova.virt.libvirt.host [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.428 187132 DEBUG nova.virt.libvirt.host [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.434 187132 DEBUG nova.virt.libvirt.host [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.435 187132 DEBUG nova.virt.libvirt.host [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.437 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.437 187132 DEBUG nova.virt.hardware [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:55Z,direct_url=<?>,disk_format='qcow2',id=6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img_alt',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:56Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.438 187132 DEBUG nova.virt.hardware [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.438 187132 DEBUG nova.virt.hardware [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.439 187132 DEBUG nova.virt.hardware [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.439 187132 DEBUG nova.virt.hardware [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.440 187132 DEBUG nova.virt.hardware [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.440 187132 DEBUG nova.virt.hardware [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.441 187132 DEBUG nova.virt.hardware [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.441 187132 DEBUG nova.virt.hardware [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.441 187132 DEBUG nova.virt.hardware [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.442 187132 DEBUG nova.virt.hardware [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.442 187132 DEBUG nova.objects.instance [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'vcpu_model' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.476 187132 DEBUG nova.virt.libvirt.vif [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-11T06:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1449027617',display_name='tempest-TestNetworkAdvancedServerOps-server-1449027617',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1449027617',id=26,image_ref='6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIYjzi0ZhE9Kf1ZEwkmM5x6WL85TnTMwYO++Dc14hBCsOp2W8k05a6CYUDsbMTvUgGW/za/aDw6E0zAJuZyOR7KgXVKhJbAmj8ilH2QdC3Taix32uGnPmVik3hsotRzG1A==',key_name='tempest-TestNetworkAdvancedServerOps-1716433215',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:11:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-s3zw0n3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:12:05Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=e5327266-05a9-47da-91fc-d5cd8866fa3e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.477 187132 DEBUG nova.network.os_vif_util [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.478 187132 DEBUG nova.network.os_vif_util [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.481 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  <uuid>e5327266-05a9-47da-91fc-d5cd8866fa3e</uuid>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  <name>instance-0000001a</name>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-1449027617</nova:name>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:12:07</nova:creationTime>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:        <nova:user uuid="40cb523bfe1e4484bb2e91c903500c97">tempest-TestNetworkAdvancedServerOps-369129245-project-member</nova:user>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:        <nova:project uuid="3ec4c03cd7274517b88d9087ad4cbd83">tempest-TestNetworkAdvancedServerOps-369129245</nova:project>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:        <nova:port uuid="1f7ef811-f5fc-4537-879a-7227b3f08154">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.6" ipVersion="4"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <entry name="serial">e5327266-05a9-47da-91fc-d5cd8866fa3e</entry>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <entry name="uuid">e5327266-05a9-47da-91fc-d5cd8866fa3e</entry>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.config"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:f2:99:40"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <target dev="tap1f7ef811-f5"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/console.log" append="off"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:12:07 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:12:07 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:12:07 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:12:07 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.485 187132 DEBUG nova.virt.libvirt.vif [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-11T06:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1449027617',display_name='tempest-TestNetworkAdvancedServerOps-server-1449027617',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1449027617',id=26,image_ref='6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIYjzi0ZhE9Kf1ZEwkmM5x6WL85TnTMwYO++Dc14hBCsOp2W8k05a6CYUDsbMTvUgGW/za/aDw6E0zAJuZyOR7KgXVKhJbAmj8ilH2QdC3Taix32uGnPmVik3hsotRzG1A==',key_name='tempest-TestNetworkAdvancedServerOps-1716433215',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:11:41Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-s3zw0n3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='rebuild_spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:12:05Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=e5327266-05a9-47da-91fc-d5cd8866fa3e,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": 
"floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.485 187132 DEBUG nova.network.os_vif_util [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.486 187132 DEBUG nova.network.os_vif_util [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.486 187132 DEBUG os_vif [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Plugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.487 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.488 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.488 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.490 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.491 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1f7ef811-f5, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.491 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap1f7ef811-f5, col_values=(('external_ids', {'iface-id': '1f7ef811-f5fc-4537-879a-7227b3f08154', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:f2:99:40', 'vm-uuid': 'e5327266-05a9-47da-91fc-d5cd8866fa3e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.493 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:07 np0005554845 NetworkManager[55529]: <info>  [1765433527.4947] manager: (tap1f7ef811-f5): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/96)
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.496 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.505 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.507 187132 INFO os_vif [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully plugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5')
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.508 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.559 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.560 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.560 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No VIF found with MAC fa:16:3e:f2:99:40, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.560 187132 INFO nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Using config drive
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.573 187132 DEBUG nova.objects.instance [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'ec2_ids' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 01:12:07 np0005554845 nova_compute[187128]: 2025-12-11 06:12:07.603 187132 DEBUG nova.objects.instance [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'keypairs' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 01:12:08 np0005554845 nova_compute[187128]: 2025-12-11 06:12:08.539 187132 DEBUG nova.compute.manager [req-d1fec7f5-e68a-47eb-943f-9cd1f9f3d59b req-a8c78ccb-edc7-4627-8cc8-03577cdf6509 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-vif-unplugged-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:12:08 np0005554845 nova_compute[187128]: 2025-12-11 06:12:08.539 187132 DEBUG oslo_concurrency.lockutils [req-d1fec7f5-e68a-47eb-943f-9cd1f9f3d59b req-a8c78ccb-edc7-4627-8cc8-03577cdf6509 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:12:08 np0005554845 nova_compute[187128]: 2025-12-11 06:12:08.539 187132 DEBUG oslo_concurrency.lockutils [req-d1fec7f5-e68a-47eb-943f-9cd1f9f3d59b req-a8c78ccb-edc7-4627-8cc8-03577cdf6509 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:12:08 np0005554845 nova_compute[187128]: 2025-12-11 06:12:08.540 187132 DEBUG oslo_concurrency.lockutils [req-d1fec7f5-e68a-47eb-943f-9cd1f9f3d59b req-a8c78ccb-edc7-4627-8cc8-03577cdf6509 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:12:08 np0005554845 nova_compute[187128]: 2025-12-11 06:12:08.540 187132 DEBUG nova.compute.manager [req-d1fec7f5-e68a-47eb-943f-9cd1f9f3d59b req-a8c78ccb-edc7-4627-8cc8-03577cdf6509 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] No waiting events found dispatching network-vif-unplugged-1f7ef811-f5fc-4537-879a-7227b3f08154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 01:12:08 np0005554845 nova_compute[187128]: 2025-12-11 06:12:08.540 187132 WARNING nova.compute.manager [req-d1fec7f5-e68a-47eb-943f-9cd1f9f3d59b req-a8c78ccb-edc7-4627-8cc8-03577cdf6509 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received unexpected event network-vif-unplugged-1f7ef811-f5fc-4537-879a-7227b3f08154 for instance with vm_state active and task_state rebuild_spawning.
Dec 11 01:12:08 np0005554845 nova_compute[187128]: 2025-12-11 06:12:08.774 187132 INFO nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Creating config drive at /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.config
Dec 11 01:12:08 np0005554845 nova_compute[187128]: 2025-12-11 06:12:08.783 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgdfpyswc execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:12:08 np0005554845 nova_compute[187128]: 2025-12-11 06:12:08.914 187132 DEBUG oslo_concurrency.processutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpgdfpyswc" returned: 0 in 0.130s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:12:08 np0005554845 kernel: tap1f7ef811-f5: entered promiscuous mode
Dec 11 01:12:08 np0005554845 NetworkManager[55529]: <info>  [1765433528.9960] manager: (tap1f7ef811-f5): new Tun device (/org/freedesktop/NetworkManager/Devices/97)
Dec 11 01:12:09 np0005554845 systemd-udevd[219648]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:12:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:09Z|00196|binding|INFO|Claiming lport 1f7ef811-f5fc-4537-879a-7227b3f08154 for this chassis.
Dec 11 01:12:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:09Z|00197|binding|INFO|1f7ef811-f5fc-4537-879a-7227b3f08154: Claiming fa:16:3e:f2:99:40 10.100.0.6
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.044 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:12:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:09Z|00198|binding|INFO|Setting lport 1f7ef811-f5fc-4537-879a-7227b3f08154 ovn-installed in OVS
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.060 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:12:09 np0005554845 NetworkManager[55529]: <info>  [1765433529.0660] device (tap1f7ef811-f5): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:12:09 np0005554845 NetworkManager[55529]: <info>  [1765433529.0667] device (tap1f7ef811-f5): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:12:09 np0005554845 systemd-machined[153381]: New machine qemu-14-instance-0000001a.
Dec 11 01:12:09 np0005554845 systemd[1]: Started Virtual Machine qemu-14-instance-0000001a.
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.137 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:99:40 10.100.0.6'], port_security=['fa:16:3e:f2:99:40 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e5327266-05a9-47da-91fc-d5cd8866fa3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1399078c-4ff0-437d-a25b-2a77a741362d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'dc718b16-d267-4010-9f1f-c5510585bdeb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.216'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d50d5fec-d774-4d77-a4b2-f05764d3d543, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=1f7ef811-f5fc-4537-879a-7227b3f08154) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:12:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:09Z|00199|binding|INFO|Setting lport 1f7ef811-f5fc-4537-879a-7227b3f08154 up in Southbound
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.138 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 1f7ef811-f5fc-4537-879a-7227b3f08154 in datapath 1399078c-4ff0-437d-a25b-2a77a741362d bound to our chassis
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.139 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 1399078c-4ff0-437d-a25b-2a77a741362d
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.152 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[052b71ce-1d3c-4754-898e-12b287c03f3c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.153 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap1399078c-41 in ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.154 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap1399078c-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.154 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d138dd55-ee86-4f57-b8e3-23f4703d1873]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.155 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7517d09f-2681-473a-a748-25bb5bb43373]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.166 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[4accbefe-c06d-4dfa-a1c6-12cf7e13dabc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.189 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[9ad8b6e3-f9d2-47c6-b21f-5560241ce01e]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.231 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[0b2e3a39-f881-40e6-b97a-b98f889737b0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 systemd-udevd[219652]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:12:09 np0005554845 NetworkManager[55529]: <info>  [1765433529.2386] manager: (tap1399078c-40): new Veth device (/org/freedesktop/NetworkManager/Devices/98)
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.240 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[88e72f93-36d7-4bc1-89a3-4daeb037a04d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.273 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[d1d87728-f4f6-483b-a12c-33a23fd37edf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.276 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[cb71d013-5192-4243-9953-2420919dbd6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 NetworkManager[55529]: <info>  [1765433529.2958] device (tap1399078c-40): carrier: link connected
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.301 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[c74bf572-3f40-4631-8b01-b3a8196a687b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.316 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cb679bcd-47f7-42fe-8aae-6d535ebec09b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1399078c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:71:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380394, 'reachable_time': 16607, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 219684, 'error': None, 'target': 'ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.331 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[60e51b21-6c96-465f-82aa-eadcfa2ea729]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed5:7199'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 380394, 'tstamp': 380394}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 219685, 'error': None, 'target': 'ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.346 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[78f362d3-a298-4053-bfc1-b0ae323b034c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap1399078c-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d5:71:99'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 58], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380394, 'reachable_time': 16607, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 219686, 'error': None, 'target': 'ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.375 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1c640b80-a011-43ec-903d-d6890dbe88cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.446 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[44b279c3-4367-4317-8dcc-123baed3f5f5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.448 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1399078c-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.448 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.449 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap1399078c-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.451 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:12:09 np0005554845 NetworkManager[55529]: <info>  [1765433529.4520] manager: (tap1399078c-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/99)
Dec 11 01:12:09 np0005554845 kernel: tap1399078c-40: entered promiscuous mode
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.453 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.456 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap1399078c-40, col_values=(('external_ids', {'iface-id': '7bb85b09-e8e0-47cb-9628-df3e8460ffff'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.457 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:12:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:09Z|00200|binding|INFO|Releasing lport 7bb85b09-e8e0-47cb-9628-df3e8460ffff from this chassis (sb_readonly=0)
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.459 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.460 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/1399078c-4ff0-437d-a25b-2a77a741362d.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/1399078c-4ff0-437d-a25b-2a77a741362d.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.461 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8f02285d-1fa5-4c14-b3be-2a890c3b3544]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.462 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-1399078c-4ff0-437d-a25b-2a77a741362d
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/1399078c-4ff0-437d-a25b-2a77a741362d.pid.haproxy
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 1399078c-4ff0-437d-a25b-2a77a741362d
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107
Dec 11 01:12:09 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:09.463 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d', 'env', 'PROCESS_TAG=haproxy-1399078c-4ff0-437d-a25b-2a77a741362d', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/1399078c-4ff0-437d-a25b-2a77a741362d.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.475 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.502 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Removed pending event for e5327266-05a9-47da-91fc-d5cd8866fa3e due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.503 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433529.5016115, e5327266-05a9-47da-91fc-d5cd8866fa3e => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.503 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] VM Resumed (Lifecycle Event)
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.506 187132 DEBUG nova.compute.manager [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.507 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.510 187132 INFO nova.virt.libvirt.driver [-] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Instance spawned successfully.
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.510 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.530 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.535 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.535 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.535 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.536 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.537 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.537 187132 DEBUG nova.virt.libvirt.driver [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.541 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.573 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.573 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433529.506184, e5327266-05a9-47da-91fc-d5cd8866fa3e => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.573 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] VM Started (Lifecycle Event)#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.588 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.595 187132 DEBUG nova.compute.manager [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.595 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Synchronizing instance power state after lifecycle event "Started"; current vm_state: active, current task_state: rebuild_spawning, current DB power_state: 1, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.624 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] During sync_power_state the instance has a pending task (rebuild_spawning). Skip.#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.658 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.659 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.659 187132 DEBUG nova.objects.instance [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Trying to apply a migration context that does not seem to be set for this instance apply_migration_context /usr/lib/python3.9/site-packages/nova/objects/instance.py:1032#033[00m
Dec 11 01:12:09 np0005554845 nova_compute[187128]: 2025-12-11 06:12:09.718 187132 DEBUG oslo_concurrency.lockutils [None req-44ac573d-f16c-4a04-b1d2-183d23ee8c29 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.finish_evacuation" :: held 0.059s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:09 np0005554845 podman[219725]: 2025-12-11 06:12:09.915503352 +0000 UTC m=+0.082635913 container create e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 11 01:12:09 np0005554845 systemd[1]: Started libpod-conmon-e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935.scope.
Dec 11 01:12:09 np0005554845 podman[219725]: 2025-12-11 06:12:09.875657486 +0000 UTC m=+0.042790107 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:12:09 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:12:09 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/68e10a2eeca2d0fcfc7862b7c7ea72c3960d44259209382fbdade4b94700beed/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:12:10 np0005554845 podman[219725]: 2025-12-11 06:12:10.010215486 +0000 UTC m=+0.177348047 container init e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 01:12:10 np0005554845 podman[219725]: 2025-12-11 06:12:10.017162667 +0000 UTC m=+0.184295228 container start e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:12:10 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219741]: [NOTICE]   (219745) : New worker (219747) forked
Dec 11 01:12:10 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219741]: [NOTICE]   (219745) : Loading success.
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.390 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.612 187132 DEBUG nova.compute.manager [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.613 187132 DEBUG oslo_concurrency.lockutils [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.613 187132 DEBUG oslo_concurrency.lockutils [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.613 187132 DEBUG oslo_concurrency.lockutils [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.614 187132 DEBUG nova.compute.manager [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] No waiting events found dispatching network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.614 187132 WARNING nova.compute.manager [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received unexpected event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 for instance with vm_state active and task_state None.#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.614 187132 DEBUG nova.compute.manager [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.615 187132 DEBUG oslo_concurrency.lockutils [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.615 187132 DEBUG oslo_concurrency.lockutils [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.615 187132 DEBUG oslo_concurrency.lockutils [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.615 187132 DEBUG nova.compute.manager [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] No waiting events found dispatching network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.616 187132 WARNING nova.compute.manager [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received unexpected event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 for instance with vm_state active and task_state None.#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.616 187132 DEBUG nova.compute.manager [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.616 187132 DEBUG oslo_concurrency.lockutils [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.617 187132 DEBUG oslo_concurrency.lockutils [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.617 187132 DEBUG oslo_concurrency.lockutils [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.617 187132 DEBUG nova.compute.manager [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] No waiting events found dispatching network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:12:10 np0005554845 nova_compute[187128]: 2025-12-11 06:12:10.617 187132 WARNING nova.compute.manager [req-1ed0d0d0-036f-4b77-a3ef-6eb399fcbdaf req-e5fd62ed-0e0c-4871-9650-c0c6afa25ea5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received unexpected event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 for instance with vm_state active and task_state None.#033[00m
Dec 11 01:12:12 np0005554845 nova_compute[187128]: 2025-12-11 06:12:12.494 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:12 np0005554845 nova_compute[187128]: 2025-12-11 06:12:12.832 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:15 np0005554845 podman[219756]: 2025-12-11 06:12:15.137636605 +0000 UTC m=+0.062139929 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:12:15 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:15.146 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6a:a2:c8 2001:db8:0:1:f816:3eff:fe6a:a2c8 2001:db8::f816:3eff:fe6a:a2c8'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe6a:a2c8/64 2001:db8::f816:3eff:fe6a:a2c8/64', 'neutron:device_id': 'ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2aa68e22-5010-4f13-b0a0-aaa483f0ac60, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2646f808-9283-4885-ac09-2ea314894a12) old=Port_Binding(mac=['fa:16:3e:6a:a2:c8 2001:db8::f816:3eff:fe6a:a2c8'], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe6a:a2c8/64', 'neutron:device_id': 'ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:12:15 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:15.147 104320 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2646f808-9283-4885-ac09-2ea314894a12 in datapath d1d0291b-2b4e-477b-a989-16bcd5f034d4 updated#033[00m
Dec 11 01:12:15 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:15.149 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1d0291b-2b4e-477b-a989-16bcd5f034d4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:12:15 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:15.150 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[547265ad-b367-4e4c-82fa-4e8aba70c915]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:15 np0005554845 nova_compute[187128]: 2025-12-11 06:12:15.393 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:17 np0005554845 nova_compute[187128]: 2025-12-11 06:12:17.538 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:20 np0005554845 nova_compute[187128]: 2025-12-11 06:12:20.395 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:21 np0005554845 podman[219781]: 2025-12-11 06:12:21.184831188 +0000 UTC m=+0.106853219 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:12:22 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:22Z|00026|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:f2:99:40 10.100.0.6
Dec 11 01:12:22 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:22Z|00027|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:f2:99:40 10.100.0.6
Dec 11 01:12:22 np0005554845 nova_compute[187128]: 2025-12-11 06:12:22.541 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:24 np0005554845 podman[219817]: 2025-12-11 06:12:24.114890362 +0000 UTC m=+0.051556529 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 01:12:24 np0005554845 podman[219818]: 2025-12-11 06:12:24.189293008 +0000 UTC m=+0.116274738 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 11 01:12:25 np0005554845 nova_compute[187128]: 2025-12-11 06:12:25.397 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:26 np0005554845 podman[219861]: 2025-12-11 06:12:26.129837431 +0000 UTC m=+0.065569585 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 11 01:12:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:26.224 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:26.225 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:26.226 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:26 np0005554845 nova_compute[187128]: 2025-12-11 06:12:26.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:12:26 np0005554845 nova_compute[187128]: 2025-12-11 06:12:26.714 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:26 np0005554845 nova_compute[187128]: 2025-12-11 06:12:26.715 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:26 np0005554845 nova_compute[187128]: 2025-12-11 06:12:26.715 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:26 np0005554845 nova_compute[187128]: 2025-12-11 06:12:26.715 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:12:26 np0005554845 nova_compute[187128]: 2025-12-11 06:12:26.789 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:26 np0005554845 nova_compute[187128]: 2025-12-11 06:12:26.883 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk --force-share --output=json" returned: 0 in 0.094s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:26 np0005554845 nova_compute[187128]: 2025-12-11 06:12:26.885 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:26 np0005554845 nova_compute[187128]: 2025-12-11 06:12:26.956 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e/disk --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.185 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.187 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5535MB free_disk=73.26716613769531GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.187 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.188 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.274 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance e5327266-05a9-47da-91fc-d5cd8866fa3e actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.275 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.275 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.326 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.341 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.364 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.364 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:27 np0005554845 nova_compute[187128]: 2025-12-11 06:12:27.543 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:28 np0005554845 nova_compute[187128]: 2025-12-11 06:12:28.359 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:12:28 np0005554845 nova_compute[187128]: 2025-12-11 06:12:28.360 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:12:28 np0005554845 nova_compute[187128]: 2025-12-11 06:12:28.361 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:12:28 np0005554845 nova_compute[187128]: 2025-12-11 06:12:28.361 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:12:28 np0005554845 nova_compute[187128]: 2025-12-11 06:12:28.546 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:12:28 np0005554845 nova_compute[187128]: 2025-12-11 06:12:28.547 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:12:28 np0005554845 nova_compute[187128]: 2025-12-11 06:12:28.547 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 01:12:28 np0005554845 nova_compute[187128]: 2025-12-11 06:12:28.547 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:12:29 np0005554845 nova_compute[187128]: 2025-12-11 06:12:29.568 187132 INFO nova.compute.manager [None req-734da8fe-cf86-4521-a681-7231ddab275c 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Get console output#033[00m
Dec 11 01:12:29 np0005554845 nova_compute[187128]: 2025-12-11 06:12:29.575 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:12:30 np0005554845 podman[219890]: 2025-12-11 06:12:30.140124569 +0000 UTC m=+0.069765419 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:12:30 np0005554845 podman[219891]: 2025-12-11 06:12:30.158143315 +0000 UTC m=+0.075313942 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.expose-services=, version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, managed_by=edpm_ansible, vcs-type=git, container_name=openstack_network_exporter, io.buildah.version=1.33.7, name=ubi9-minimal, config_id=edpm, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., maintainer=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 11 01:12:30 np0005554845 nova_compute[187128]: 2025-12-11 06:12:30.169 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Updating instance_info_cache with network_info: [{"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:12:30 np0005554845 nova_compute[187128]: 2025-12-11 06:12:30.186 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:12:30 np0005554845 nova_compute[187128]: 2025-12-11 06:12:30.186 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:12:30 np0005554845 nova_compute[187128]: 2025-12-11 06:12:30.187 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:12:30 np0005554845 nova_compute[187128]: 2025-12-11 06:12:30.188 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:12:30 np0005554845 nova_compute[187128]: 2025-12-11 06:12:30.188 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:12:30 np0005554845 nova_compute[187128]: 2025-12-11 06:12:30.188 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:12:30 np0005554845 nova_compute[187128]: 2025-12-11 06:12:30.189 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:12:30 np0005554845 nova_compute[187128]: 2025-12-11 06:12:30.399 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.515 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.708 187132 DEBUG oslo_concurrency.lockutils [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.709 187132 DEBUG oslo_concurrency.lockutils [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.709 187132 DEBUG oslo_concurrency.lockutils [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.710 187132 DEBUG oslo_concurrency.lockutils [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.710 187132 DEBUG oslo_concurrency.lockutils [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.711 187132 INFO nova.compute.manager [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Terminating instance#033[00m
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.712 187132 DEBUG nova.compute.manager [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:12:31 np0005554845 kernel: tap1f7ef811-f5 (unregistering): left promiscuous mode
Dec 11 01:12:31 np0005554845 NetworkManager[55529]: <info>  [1765433551.7501] device (tap1f7ef811-f5): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:12:31 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:31Z|00201|binding|INFO|Releasing lport 1f7ef811-f5fc-4537-879a-7227b3f08154 from this chassis (sb_readonly=0)
Dec 11 01:12:31 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:31Z|00202|binding|INFO|Setting lport 1f7ef811-f5fc-4537-879a-7227b3f08154 down in Southbound
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.753 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:31 np0005554845 ovn_controller[95428]: 2025-12-11T06:12:31Z|00203|binding|INFO|Removing iface tap1f7ef811-f5 ovn-installed in OVS
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.755 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:31.763 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:f2:99:40 10.100.0.6'], port_security=['fa:16:3e:f2:99:40 10.100.0.6'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.6/28', 'neutron:device_id': 'e5327266-05a9-47da-91fc-d5cd8866fa3e', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-1399078c-4ff0-437d-a25b-2a77a741362d', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '6', 'neutron:security_group_ids': 'dc718b16-d267-4010-9f1f-c5510585bdeb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=d50d5fec-d774-4d77-a4b2-f05764d3d543, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=1f7ef811-f5fc-4537-879a-7227b3f08154) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:12:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:31.765 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 1f7ef811-f5fc-4537-879a-7227b3f08154 in datapath 1399078c-4ff0-437d-a25b-2a77a741362d unbound from our chassis#033[00m
Dec 11 01:12:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:31.768 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 1399078c-4ff0-437d-a25b-2a77a741362d, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:12:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:31.769 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c7fd9655-6932-4c1d-b8af-cb33feebeb9a]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:31.770 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d namespace which is not needed anymore#033[00m
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.777 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:31 np0005554845 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001a.scope: Deactivated successfully.
Dec 11 01:12:31 np0005554845 systemd[1]: machine-qemu\x2d14\x2dinstance\x2d0000001a.scope: Consumed 13.173s CPU time.
Dec 11 01:12:31 np0005554845 systemd-machined[153381]: Machine qemu-14-instance-0000001a terminated.
Dec 11 01:12:31 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219741]: [NOTICE]   (219745) : haproxy version is 2.8.14-c23fe91
Dec 11 01:12:31 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219741]: [NOTICE]   (219745) : path to executable is /usr/sbin/haproxy
Dec 11 01:12:31 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219741]: [WARNING]  (219745) : Exiting Master process...
Dec 11 01:12:31 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219741]: [ALERT]    (219745) : Current worker (219747) exited with code 143 (Terminated)
Dec 11 01:12:31 np0005554845 neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d[219741]: [WARNING]  (219745) : All workers exited. Exiting... (0)
Dec 11 01:12:31 np0005554845 systemd[1]: libpod-e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935.scope: Deactivated successfully.
Dec 11 01:12:31 np0005554845 podman[219960]: 2025-12-11 06:12:31.93436616 +0000 UTC m=+0.048532176 container died e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:12:31 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935-userdata-shm.mount: Deactivated successfully.
Dec 11 01:12:31 np0005554845 systemd[1]: var-lib-containers-storage-overlay-68e10a2eeca2d0fcfc7862b7c7ea72c3960d44259209382fbdade4b94700beed-merged.mount: Deactivated successfully.
Dec 11 01:12:31 np0005554845 podman[219960]: 2025-12-11 06:12:31.975529431 +0000 UTC m=+0.089695457 container cleanup e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.990 187132 INFO nova.virt.libvirt.driver [-] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Instance destroyed successfully.#033[00m
Dec 11 01:12:31 np0005554845 nova_compute[187128]: 2025-12-11 06:12:31.990 187132 DEBUG nova.objects.instance [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'resources' on Instance uuid e5327266-05a9-47da-91fc-d5cd8866fa3e obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:12:32 np0005554845 systemd[1]: libpod-conmon-e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935.scope: Deactivated successfully.
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.009 187132 DEBUG nova.virt.libvirt.vif [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=True,config_drive='True',created_at=2025-12-11T06:11:29Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-1449027617',display_name='tempest-TestNetworkAdvancedServerOps-server-1449027617',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-1449027617',id=26,image_ref='6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBIYjzi0ZhE9Kf1ZEwkmM5x6WL85TnTMwYO++Dc14hBCsOp2W8k05a6CYUDsbMTvUgGW/za/aDw6E0zAJuZyOR7KgXVKhJbAmj8ilH2QdC3Taix32uGnPmVik3hsotRzG1A==',key_name='tempest-TestNetworkAdvancedServerOps-1716433215',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:12:09Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-s3zw0n3v',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',clean_attempts='1',image_base_image_ref='6bafa3fe-a2ea-4f5b-9a20-33fb3d4c5c7b',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:12:09Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=e5327266-05a9-47da-91fc-d5cd8866fa3e,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.010 187132 DEBUG nova.network.os_vif_util [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.216", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.011 187132 DEBUG nova.network.os_vif_util [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.012 187132 DEBUG os_vif [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.013 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.014 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1f7ef811-f5, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.015 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.019 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.022 187132 INFO os_vif [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:f2:99:40,bridge_name='br-int',has_traffic_filtering=True,id=1f7ef811-f5fc-4537-879a-7227b3f08154,network=Network(1399078c-4ff0-437d-a25b-2a77a741362d),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap1f7ef811-f5')#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.023 187132 INFO nova.virt.libvirt.driver [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Deleting instance files /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e_del#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.024 187132 INFO nova.virt.libvirt.driver [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Deletion of /var/lib/nova/instances/e5327266-05a9-47da-91fc-d5cd8866fa3e_del complete#033[00m
Dec 11 01:12:32 np0005554845 podman[220005]: 2025-12-11 06:12:32.050656358 +0000 UTC m=+0.049214945 container remove e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.055 187132 DEBUG nova.compute.manager [req-75e850c6-1118-4896-aa7a-94f303729de1 req-4347f83a-e600-4298-8e7f-5e1b49c9ca89 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-changed-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.056 187132 DEBUG nova.compute.manager [req-75e850c6-1118-4896-aa7a-94f303729de1 req-4347f83a-e600-4298-8e7f-5e1b49c9ca89 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Refreshing instance network info cache due to event network-changed-1f7ef811-f5fc-4537-879a-7227b3f08154. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.057 187132 DEBUG oslo_concurrency.lockutils [req-75e850c6-1118-4896-aa7a-94f303729de1 req-4347f83a-e600-4298-8e7f-5e1b49c9ca89 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.057 187132 DEBUG oslo_concurrency.lockutils [req-75e850c6-1118-4896-aa7a-94f303729de1 req-4347f83a-e600-4298-8e7f-5e1b49c9ca89 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.058 187132 DEBUG nova.network.neutron [req-75e850c6-1118-4896-aa7a-94f303729de1 req-4347f83a-e600-4298-8e7f-5e1b49c9ca89 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Refreshing network info cache for port 1f7ef811-f5fc-4537-879a-7227b3f08154 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:12:32 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:32.058 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2d51d281-5a56-40f3-b621-9bbcda44fb02]: (4, ('Thu Dec 11 06:12:31 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d (e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935)\ne2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935\nThu Dec 11 06:12:31 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d (e2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935)\ne2329f41d46074d20270d931668d755e9fa574cf2b40b6af4c3d2771cbb90935\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:32 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:32.060 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3a8a0928-4153-4ec6-a202-e5baad049141]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:32 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:32.062 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap1399078c-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:12:32 np0005554845 kernel: tap1399078c-40: left promiscuous mode
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.064 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.080 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:32 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:32.083 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[38e74e5d-ae42-4392-adc9-081786e85433]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.102 187132 INFO nova.compute.manager [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.103 187132 DEBUG oslo.service.loopingcall [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.104 187132 DEBUG nova.compute.manager [-] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.104 187132 DEBUG nova.network.neutron [-] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:12:32 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:32.104 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[808bd6ef-adaf-44fc-aaa1-65a7d8d10d05]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:32 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:32.105 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c91afcbd-ade8-40de-b8a8-a0c1be20723c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:32 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:32.133 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1aeb309a-8df4-4a76-b9bf-c09f9825287c]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 380386, 'reachable_time': 39660, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220023, 'error': None, 'target': 'ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:32 np0005554845 systemd[1]: run-netns-ovnmeta\x2d1399078c\x2d4ff0\x2d437d\x2da25b\x2d2a77a741362d.mount: Deactivated successfully.
Dec 11 01:12:32 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:32.135 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-1399078c-4ff0-437d-a25b-2a77a741362d deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:12:32 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:32.136 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[aff83a48-41e0-4ea7-a4f9-5fea9265d23e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.432 187132 DEBUG nova.compute.manager [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-vif-unplugged-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.433 187132 DEBUG oslo_concurrency.lockutils [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.433 187132 DEBUG oslo_concurrency.lockutils [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.433 187132 DEBUG oslo_concurrency.lockutils [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.433 187132 DEBUG nova.compute.manager [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] No waiting events found dispatching network-vif-unplugged-1f7ef811-f5fc-4537-879a-7227b3f08154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.434 187132 DEBUG nova.compute.manager [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-vif-unplugged-1f7ef811-f5fc-4537-879a-7227b3f08154 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.434 187132 DEBUG nova.compute.manager [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.434 187132 DEBUG oslo_concurrency.lockutils [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.434 187132 DEBUG oslo_concurrency.lockutils [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.434 187132 DEBUG oslo_concurrency.lockutils [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.434 187132 DEBUG nova.compute.manager [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] No waiting events found dispatching network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:12:32 np0005554845 nova_compute[187128]: 2025-12-11 06:12:32.435 187132 WARNING nova.compute.manager [req-8fe9cccd-218b-4fb6-87f1-11ef9a6d1e65 req-08979e74-53ff-4add-b2b3-3176544739bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received unexpected event network-vif-plugged-1f7ef811-f5fc-4537-879a-7227b3f08154 for instance with vm_state active and task_state deleting.#033[00m
Dec 11 01:12:33 np0005554845 nova_compute[187128]: 2025-12-11 06:12:33.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:12:33 np0005554845 nova_compute[187128]: 2025-12-11 06:12:33.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.109 187132 DEBUG nova.network.neutron [-] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.141 187132 INFO nova.compute.manager [-] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Took 3.04 seconds to deallocate network for instance.#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.186 187132 DEBUG nova.compute.manager [req-edf3abd4-486e-4239-ae18-93e645ca1261 req-8ebd0353-4cfb-4d2c-8514-5aebe356dcca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Received event network-vif-deleted-1f7ef811-f5fc-4537-879a-7227b3f08154 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.190 187132 DEBUG oslo_concurrency.lockutils [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.190 187132 DEBUG oslo_concurrency.lockutils [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.255 187132 DEBUG nova.compute.provider_tree [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.272 187132 DEBUG nova.scheduler.client.report [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.297 187132 DEBUG oslo_concurrency.lockutils [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.107s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.329 187132 INFO nova.scheduler.client.report [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Deleted allocations for instance e5327266-05a9-47da-91fc-d5cd8866fa3e#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.401 187132 DEBUG oslo_concurrency.lockutils [None req-0e7d4a27-f2b3-45a9-8b93-7dfebbb117b1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e5327266-05a9-47da-91fc-d5cd8866fa3e" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.692s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.402 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.575 187132 DEBUG nova.network.neutron [req-75e850c6-1118-4896-aa7a-94f303729de1 req-4347f83a-e600-4298-8e7f-5e1b49c9ca89 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Updated VIF entry in instance network info cache for port 1f7ef811-f5fc-4537-879a-7227b3f08154. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.576 187132 DEBUG nova.network.neutron [req-75e850c6-1118-4896-aa7a-94f303729de1 req-4347f83a-e600-4298-8e7f-5e1b49c9ca89 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Updating instance_info_cache with network_info: [{"id": "1f7ef811-f5fc-4537-879a-7227b3f08154", "address": "fa:16:3e:f2:99:40", "network": {"id": "1399078c-4ff0-437d-a25b-2a77a741362d", "bridge": "br-int", "label": "tempest-network-smoke--749450354", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.6", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap1f7ef811-f5", "ovs_interfaceid": "1f7ef811-f5fc-4537-879a-7227b3f08154", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:12:35 np0005554845 nova_compute[187128]: 2025-12-11 06:12:35.594 187132 DEBUG oslo_concurrency.lockutils [req-75e850c6-1118-4896-aa7a-94f303729de1 req-4347f83a-e600-4298-8e7f-5e1b49c9ca89 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-e5327266-05a9-47da-91fc-d5cd8866fa3e" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:12:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:36.788 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=11, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=10) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:12:36 np0005554845 nova_compute[187128]: 2025-12-11 06:12:36.789 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:36.791 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:12:37 np0005554845 nova_compute[187128]: 2025-12-11 06:12:37.017 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:39 np0005554845 nova_compute[187128]: 2025-12-11 06:12:39.415 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:39 np0005554845 nova_compute[187128]: 2025-12-11 06:12:39.596 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:40 np0005554845 nova_compute[187128]: 2025-12-11 06:12:40.403 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:42 np0005554845 nova_compute[187128]: 2025-12-11 06:12:42.020 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:42 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:12:42.795 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '11'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:12:45 np0005554845 nova_compute[187128]: 2025-12-11 06:12:45.405 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:46 np0005554845 podman[220026]: 2025-12-11 06:12:46.12070349 +0000 UTC m=+0.053370728 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:12:46 np0005554845 nova_compute[187128]: 2025-12-11 06:12:46.987 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433551.9861994, e5327266-05a9-47da-91fc-d5cd8866fa3e => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:12:46 np0005554845 nova_compute[187128]: 2025-12-11 06:12:46.987 187132 INFO nova.compute.manager [-] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:12:47 np0005554845 nova_compute[187128]: 2025-12-11 06:12:47.003 187132 DEBUG nova.compute.manager [None req-b40648e1-7604-42b8-9647-1d8901943def - - - - - -] [instance: e5327266-05a9-47da-91fc-d5cd8866fa3e] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:12:47 np0005554845 nova_compute[187128]: 2025-12-11 06:12:47.022 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:50 np0005554845 nova_compute[187128]: 2025-12-11 06:12:50.407 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:52 np0005554845 nova_compute[187128]: 2025-12-11 06:12:52.023 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:52 np0005554845 podman[220050]: 2025-12-11 06:12:52.128677945 +0000 UTC m=+0.061988636 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 11 01:12:55 np0005554845 podman[220070]: 2025-12-11 06:12:55.149354831 +0000 UTC m=+0.062773128 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:12:55 np0005554845 podman[220071]: 2025-12-11 06:12:55.189465304 +0000 UTC m=+0.105274386 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:12:55 np0005554845 nova_compute[187128]: 2025-12-11 06:12:55.409 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:57 np0005554845 nova_compute[187128]: 2025-12-11 06:12:57.025 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:12:57 np0005554845 podman[220115]: 2025-12-11 06:12:57.137303517 +0000 UTC m=+0.071454886 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, container_name=multipathd, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.410 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.411 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.437 187132 DEBUG nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.524 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.525 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.531 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.531 187132 INFO nova.compute.claims [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.651 187132 DEBUG nova.compute.provider_tree [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.704 187132 DEBUG nova.scheduler.client.report [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.741 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.216s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.742 187132 DEBUG nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.796 187132 DEBUG nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.797 187132 DEBUG nova.network.neutron [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.819 187132 INFO nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.840 187132 DEBUG nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.957 187132 DEBUG nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.958 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.958 187132 INFO nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Creating image(s)#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.959 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "/var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.959 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.960 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:58 np0005554845 nova_compute[187128]: 2025-12-11 06:12:58.974 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.036 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.037 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.038 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.049 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.070 187132 DEBUG nova.policy [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.112 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.113 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.176 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk 1073741824" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.177 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.140s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.178 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.239 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.241 187132 DEBUG nova.virt.disk.api [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Checking if we can resize image /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.242 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.302 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk --force-share --output=json" returned: 0 in 0.060s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.303 187132 DEBUG nova.virt.disk.api [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Cannot resize image /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.304 187132 DEBUG nova.objects.instance [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'migration_context' on Instance uuid b7fcd131-1c40-4ddc-9d8a-6a9b503cb773 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.329 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.330 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Ensure instance console log exists: /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.330 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.331 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:12:59 np0005554845 nova_compute[187128]: 2025-12-11 06:12:59.332 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:00 np0005554845 nova_compute[187128]: 2025-12-11 06:13:00.412 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:00 np0005554845 nova_compute[187128]: 2025-12-11 06:13:00.842 187132 DEBUG nova.network.neutron [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Successfully created port: 4afc532b-f213-41cf-9252-65894783ee04 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:13:01 np0005554845 podman[220150]: 2025-12-11 06:13:01.126319821 +0000 UTC m=+0.057477341 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:13:01 np0005554845 podman[220151]: 2025-12-11 06:13:01.148407849 +0000 UTC m=+0.070919152 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, vendor=Red Hat, Inc., version=9.6, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.expose-services=)
Dec 11 01:13:02 np0005554845 nova_compute[187128]: 2025-12-11 06:13:02.027 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:02 np0005554845 nova_compute[187128]: 2025-12-11 06:13:02.127 187132 DEBUG nova.network.neutron [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Successfully created port: 69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:13:03 np0005554845 nova_compute[187128]: 2025-12-11 06:13:03.967 187132 DEBUG nova.network.neutron [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Successfully updated port: 4afc532b-f213-41cf-9252-65894783ee04 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:13:04 np0005554845 nova_compute[187128]: 2025-12-11 06:13:04.069 187132 DEBUG nova.compute.manager [req-a943bf4a-d269-4ef2-be2c-5096a5daa320 req-94c26d5a-6d36-40fd-a9f3-48c7e94de826 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-changed-4afc532b-f213-41cf-9252-65894783ee04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:04 np0005554845 nova_compute[187128]: 2025-12-11 06:13:04.070 187132 DEBUG nova.compute.manager [req-a943bf4a-d269-4ef2-be2c-5096a5daa320 req-94c26d5a-6d36-40fd-a9f3-48c7e94de826 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Refreshing instance network info cache due to event network-changed-4afc532b-f213-41cf-9252-65894783ee04. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:13:04 np0005554845 nova_compute[187128]: 2025-12-11 06:13:04.070 187132 DEBUG oslo_concurrency.lockutils [req-a943bf4a-d269-4ef2-be2c-5096a5daa320 req-94c26d5a-6d36-40fd-a9f3-48c7e94de826 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:13:04 np0005554845 nova_compute[187128]: 2025-12-11 06:13:04.070 187132 DEBUG oslo_concurrency.lockutils [req-a943bf4a-d269-4ef2-be2c-5096a5daa320 req-94c26d5a-6d36-40fd-a9f3-48c7e94de826 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:13:04 np0005554845 nova_compute[187128]: 2025-12-11 06:13:04.071 187132 DEBUG nova.network.neutron [req-a943bf4a-d269-4ef2-be2c-5096a5daa320 req-94c26d5a-6d36-40fd-a9f3-48c7e94de826 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Refreshing network info cache for port 4afc532b-f213-41cf-9252-65894783ee04 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:13:04 np0005554845 nova_compute[187128]: 2025-12-11 06:13:04.307 187132 DEBUG nova.network.neutron [req-a943bf4a-d269-4ef2-be2c-5096a5daa320 req-94c26d5a-6d36-40fd-a9f3-48c7e94de826 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:13:04 np0005554845 nova_compute[187128]: 2025-12-11 06:13:04.699 187132 DEBUG nova.network.neutron [req-a943bf4a-d269-4ef2-be2c-5096a5daa320 req-94c26d5a-6d36-40fd-a9f3-48c7e94de826 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:13:04 np0005554845 nova_compute[187128]: 2025-12-11 06:13:04.719 187132 DEBUG oslo_concurrency.lockutils [req-a943bf4a-d269-4ef2-be2c-5096a5daa320 req-94c26d5a-6d36-40fd-a9f3-48c7e94de826 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:13:05 np0005554845 nova_compute[187128]: 2025-12-11 06:13:05.079 187132 DEBUG nova.network.neutron [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Successfully updated port: 69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:13:05 np0005554845 nova_compute[187128]: 2025-12-11 06:13:05.095 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:13:05 np0005554845 nova_compute[187128]: 2025-12-11 06:13:05.096 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquired lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:13:05 np0005554845 nova_compute[187128]: 2025-12-11 06:13:05.096 187132 DEBUG nova.network.neutron [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:13:05 np0005554845 nova_compute[187128]: 2025-12-11 06:13:05.414 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:06 np0005554845 nova_compute[187128]: 2025-12-11 06:13:06.443 187132 DEBUG nova.network.neutron [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:13:06 np0005554845 nova_compute[187128]: 2025-12-11 06:13:06.621 187132 DEBUG nova.compute.manager [req-14adbf24-9e86-4b58-9a5c-bb12309050e6 req-58b0ff3f-b97f-4e1e-929f-fdd1c9209e4b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-changed-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:06 np0005554845 nova_compute[187128]: 2025-12-11 06:13:06.622 187132 DEBUG nova.compute.manager [req-14adbf24-9e86-4b58-9a5c-bb12309050e6 req-58b0ff3f-b97f-4e1e-929f-fdd1c9209e4b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Refreshing instance network info cache due to event network-changed-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:13:06 np0005554845 nova_compute[187128]: 2025-12-11 06:13:06.622 187132 DEBUG oslo_concurrency.lockutils [req-14adbf24-9e86-4b58-9a5c-bb12309050e6 req-58b0ff3f-b97f-4e1e-929f-fdd1c9209e4b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:13:07 np0005554845 nova_compute[187128]: 2025-12-11 06:13:07.030 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:10 np0005554845 nova_compute[187128]: 2025-12-11 06:13:10.416 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.513 187132 DEBUG nova.network.neutron [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Updating instance_info_cache with network_info: [{"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": 
"2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.543 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Releasing lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.544 187132 DEBUG nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Instance network_info: |[{"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", 
"version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.545 187132 DEBUG oslo_concurrency.lockutils [req-14adbf24-9e86-4b58-9a5c-bb12309050e6 req-58b0ff3f-b97f-4e1e-929f-fdd1c9209e4b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.545 187132 DEBUG nova.network.neutron [req-14adbf24-9e86-4b58-9a5c-bb12309050e6 req-58b0ff3f-b97f-4e1e-929f-fdd1c9209e4b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Refreshing network info cache for port 69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.550 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Start _get_guest_xml network_info=[{"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": 
"gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml 
/usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.557 187132 WARNING nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.566 187132 DEBUG nova.virt.libvirt.host [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.567 187132 DEBUG nova.virt.libvirt.host [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.570 187132 DEBUG nova.virt.libvirt.host [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.570 187132 DEBUG nova.virt.libvirt.host [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.572 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.572 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.572 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.572 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.573 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.573 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.573 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.573 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.574 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.574 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.574 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.574 187132 DEBUG nova.virt.hardware [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.578 187132 DEBUG nova.virt.libvirt.vif [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:12:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686231499',display_name='tempest-TestGettingAddress-server-1686231499',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686231499',id=31,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCR4mP1++4ml1efuLG9dj+GrIncgFZYen/iWp4xYfQhgvW9R/EgaRjQN1FHv7kPm7pJxVLhyVeP2wU5TC12FbnpGY+FKpeNWgR+UrlKQdDnQLuiET9srsqRIG5uxBcynUQ==',key_name='tempest-TestGettingAddress-763091187',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-jgoeblqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:12:58Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=b7fcd131-1c40-4ddc-9d8a-6a9b503cb773,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.578 187132 DEBUG nova.network.os_vif_util [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.579 187132 DEBUG nova.network.os_vif_util [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:0b:4a,bridge_name='br-int',has_traffic_filtering=True,id=4afc532b-f213-41cf-9252-65894783ee04,network=Network(63692175-a3b4-4228-86f4-602a703ce14b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4afc532b-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.579 187132 DEBUG nova.virt.libvirt.vif [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:12:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686231499',display_name='tempest-TestGettingAddress-server-1686231499',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686231499',id=31,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCR4mP1++4ml1efuLG9dj+GrIncgFZYen/iWp4xYfQhgvW9R/EgaRjQN1FHv7kPm7pJxVLhyVeP2wU5TC12FbnpGY+FKpeNWgR+UrlKQdDnQLuiET9srsqRIG5uxBcynUQ==',key_name='tempest-TestGettingAddress-763091187',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-jgoeblqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:12:58Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=b7fcd131-1c40-4ddc-9d8a-6a9b503cb773,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.580 187132 DEBUG nova.network.os_vif_util [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.580 187132 DEBUG nova.network.os_vif_util [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:89:3c,bridge_name='br-int',has_traffic_filtering=True,id=69f4d5c0-8f90-4321-8f66-92eb4d8d49b3,network=Network(d1d0291b-2b4e-477b-a989-16bcd5f034d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f4d5c0-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.581 187132 DEBUG nova.objects.instance [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid b7fcd131-1c40-4ddc-9d8a-6a9b503cb773 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.597 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  <uuid>b7fcd131-1c40-4ddc-9d8a-6a9b503cb773</uuid>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  <name>instance-0000001f</name>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestGettingAddress-server-1686231499</nova:name>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:13:11</nova:creationTime>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:        <nova:user uuid="60e9372de4754580913a836e11b9c248">tempest-TestGettingAddress-725523770-project-member</nova:user>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:        <nova:project uuid="79a211a6fc3c4f68b6c3d0ba433964d3">tempest-TestGettingAddress-725523770</nova:project>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:        <nova:port uuid="4afc532b-f213-41cf-9252-65894783ee04">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:        <nova:port uuid="69f4d5c0-8f90-4321-8f66-92eb4d8d49b3">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe6b:893c" ipVersion="6"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe6b:893c" ipVersion="6"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <entry name="serial">b7fcd131-1c40-4ddc-9d8a-6a9b503cb773</entry>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <entry name="uuid">b7fcd131-1c40-4ddc-9d8a-6a9b503cb773</entry>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.config"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:af:0b:4a"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <target dev="tap4afc532b-f2"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:6b:89:3c"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <target dev="tap69f4d5c0-8f"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/console.log" append="off"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:13:11 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:13:11 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:13:11 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:13:11 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.599 187132 DEBUG nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Preparing to wait for external event network-vif-plugged-4afc532b-f213-41cf-9252-65894783ee04 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.599 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.599 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.599 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.600 187132 DEBUG nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Preparing to wait for external event network-vif-plugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.600 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.600 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.600 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.601 187132 DEBUG nova.virt.libvirt.vif [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:12:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686231499',display_name='tempest-TestGettingAddress-server-1686231499',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686231499',id=31,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCR4mP1++4ml1efuLG9dj+GrIncgFZYen/iWp4xYfQhgvW9R/EgaRjQN1FHv7kPm7pJxVLhyVeP2wU5TC12FbnpGY+FKpeNWgR+UrlKQdDnQLuiET9srsqRIG5uxBcynUQ==',key_name='tempest-TestGettingAddress-763091187',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-jgoeblqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:12:58Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=b7fcd131-1c40-4ddc-9d8a-6a9b503cb773,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.601 187132 DEBUG nova.network.os_vif_util [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.602 187132 DEBUG nova.network.os_vif_util [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:af:0b:4a,bridge_name='br-int',has_traffic_filtering=True,id=4afc532b-f213-41cf-9252-65894783ee04,network=Network(63692175-a3b4-4228-86f4-602a703ce14b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4afc532b-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.602 187132 DEBUG os_vif [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:0b:4a,bridge_name='br-int',has_traffic_filtering=True,id=4afc532b-f213-41cf-9252-65894783ee04,network=Network(63692175-a3b4-4228-86f4-602a703ce14b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4afc532b-f2') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.602 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.603 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.603 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.606 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.606 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4afc532b-f2, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.606 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4afc532b-f2, col_values=(('external_ids', {'iface-id': '4afc532b-f213-41cf-9252-65894783ee04', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:af:0b:4a', 'vm-uuid': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.608 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:11 np0005554845 NetworkManager[55529]: <info>  [1765433591.6091] manager: (tap4afc532b-f2): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/100)
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.610 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.616 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.617 187132 INFO os_vif [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:af:0b:4a,bridge_name='br-int',has_traffic_filtering=True,id=4afc532b-f213-41cf-9252-65894783ee04,network=Network(63692175-a3b4-4228-86f4-602a703ce14b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4afc532b-f2')#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.618 187132 DEBUG nova.virt.libvirt.vif [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:12:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686231499',display_name='tempest-TestGettingAddress-server-1686231499',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686231499',id=31,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCR4mP1++4ml1efuLG9dj+GrIncgFZYen/iWp4xYfQhgvW9R/EgaRjQN1FHv7kPm7pJxVLhyVeP2wU5TC12FbnpGY+FKpeNWgR+UrlKQdDnQLuiET9srsqRIG5uxBcynUQ==',key_name='tempest-TestGettingAddress-763091187',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-jgoeblqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:12:58Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=b7fcd131-1c40-4ddc-9d8a-6a9b503cb773,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.618 187132 DEBUG nova.network.os_vif_util [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.619 187132 DEBUG nova.network.os_vif_util [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:6b:89:3c,bridge_name='br-int',has_traffic_filtering=True,id=69f4d5c0-8f90-4321-8f66-92eb4d8d49b3,network=Network(d1d0291b-2b4e-477b-a989-16bcd5f034d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f4d5c0-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.619 187132 DEBUG os_vif [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:89:3c,bridge_name='br-int',has_traffic_filtering=True,id=69f4d5c0-8f90-4321-8f66-92eb4d8d49b3,network=Network(d1d0291b-2b4e-477b-a989-16bcd5f034d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f4d5c0-8f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.619 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.620 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.620 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.623 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.623 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap69f4d5c0-8f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.623 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap69f4d5c0-8f, col_values=(('external_ids', {'iface-id': '69f4d5c0-8f90-4321-8f66-92eb4d8d49b3', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:6b:89:3c', 'vm-uuid': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.625 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:11 np0005554845 NetworkManager[55529]: <info>  [1765433591.6264] manager: (tap69f4d5c0-8f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/101)
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.627 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.631 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.632 187132 INFO os_vif [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:6b:89:3c,bridge_name='br-int',has_traffic_filtering=True,id=69f4d5c0-8f90-4321-8f66-92eb4d8d49b3,network=Network(d1d0291b-2b4e-477b-a989-16bcd5f034d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f4d5c0-8f')#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.689 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.689 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.690 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No VIF found with MAC fa:16:3e:af:0b:4a, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.690 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No VIF found with MAC fa:16:3e:6b:89:3c, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:13:11 np0005554845 nova_compute[187128]: 2025-12-11 06:13:11.691 187132 INFO nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Using config drive#033[00m
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.319 187132 INFO nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Creating config drive at /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.config#033[00m
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.326 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphwa6h75i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.469 187132 DEBUG oslo_concurrency.processutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmphwa6h75i" returned: 0 in 0.143s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:13:12 np0005554845 NetworkManager[55529]: <info>  [1765433592.5588] manager: (tap4afc532b-f2): new Tun device (/org/freedesktop/NetworkManager/Devices/102)
Dec 11 01:13:12 np0005554845 kernel: tap4afc532b-f2: entered promiscuous mode
Dec 11 01:13:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:12Z|00204|binding|INFO|Claiming lport 4afc532b-f213-41cf-9252-65894783ee04 for this chassis.
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.561 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:12Z|00205|binding|INFO|4afc532b-f213-41cf-9252-65894783ee04: Claiming fa:16:3e:af:0b:4a 10.100.0.10
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.576 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:12 np0005554845 kernel: tap69f4d5c0-8f: entered promiscuous mode
Dec 11 01:13:12 np0005554845 NetworkManager[55529]: <info>  [1765433592.5823] manager: (tap69f4d5c0-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/103)
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.581 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:12 np0005554845 NetworkManager[55529]: <info>  [1765433592.5849] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/104)
Dec 11 01:13:12 np0005554845 NetworkManager[55529]: <info>  [1765433592.5857] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/105)
Dec 11 01:13:12 np0005554845 systemd-udevd[220224]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:13:12 np0005554845 systemd-udevd[220223]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.595 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:0b:4a 10.100.0.10'], port_security=['fa:16:3e:af:0b:4a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63692175-a3b4-4228-86f4-602a703ce14b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '68d01dd1-67cf-4b05-b3d1-1764b2624dfa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2a3d0efe-c96c-491b-ac4c-78adea640873, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=4afc532b-f213-41cf-9252-65894783ee04) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.597 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 4afc532b-f213-41cf-9252-65894783ee04 in datapath 63692175-a3b4-4228-86f4-602a703ce14b bound to our chassis#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.598 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 63692175-a3b4-4228-86f4-602a703ce14b#033[00m
Dec 11 01:13:12 np0005554845 NetworkManager[55529]: <info>  [1765433592.6206] device (tap4afc532b-f2): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:13:12 np0005554845 NetworkManager[55529]: <info>  [1765433592.6221] device (tap4afc532b-f2): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:13:12 np0005554845 NetworkManager[55529]: <info>  [1765433592.6253] device (tap69f4d5c0-8f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:13:12 np0005554845 NetworkManager[55529]: <info>  [1765433592.6269] device (tap69f4d5c0-8f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.625 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2d407803-da88-4747-ae04-297bf77511ab]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.630 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap63692175-a1 in ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.633 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap63692175-a0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.633 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[14f8fc41-6700-4401-a974-4de2e9851ad0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.634 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f637e34c-b414-4867-8c87-bb5afd5df8b1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 systemd-machined[153381]: New machine qemu-15-instance-0000001f.
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.654 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[77543950-eb9f-4c19-b6ba-f1caf11d4972]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.682 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5cae7275-5ba0-41f8-ad97-fae91a28b017]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 systemd[1]: Started Virtual Machine qemu-15-instance-0000001f.
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.723 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[971e628c-ed6e-4214-9b9b-119574fa3a4b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.734 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.734 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8be5077a-3cd4-4235-bacb-b9b6775bcf38]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.738 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:12 np0005554845 NetworkManager[55529]: <info>  [1765433592.7440] manager: (tap63692175-a0): new Veth device (/org/freedesktop/NetworkManager/Devices/106)
Dec 11 01:13:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:12Z|00206|binding|INFO|Claiming lport 69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 for this chassis.
Dec 11 01:13:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:12Z|00207|binding|INFO|69f4d5c0-8f90-4321-8f66-92eb4d8d49b3: Claiming fa:16:3e:6b:89:3c 2001:db8:0:1:f816:3eff:fe6b:893c 2001:db8::f816:3eff:fe6b:893c
Dec 11 01:13:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:12Z|00208|binding|INFO|Setting lport 4afc532b-f213-41cf-9252-65894783ee04 ovn-installed in OVS
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.761 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.766 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:89:3c 2001:db8:0:1:f816:3eff:fe6b:893c 2001:db8::f816:3eff:fe6b:893c'], port_security=['fa:16:3e:6b:89:3c 2001:db8:0:1:f816:3eff:fe6b:893c 2001:db8::f816:3eff:fe6b:893c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe6b:893c/64 2001:db8::f816:3eff:fe6b:893c/64', 'neutron:device_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '68d01dd1-67cf-4b05-b3d1-1764b2624dfa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2aa68e22-5010-4f13-b0a0-aaa483f0ac60, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=69f4d5c0-8f90-4321-8f66-92eb4d8d49b3) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.766 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[502c7b37-7d83-4b1f-ba4c-f9c871ccba9b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:12Z|00209|binding|INFO|Setting lport 4afc532b-f213-41cf-9252-65894783ee04 up in Southbound
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.769 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[30a44625-0ed2-4d97-bb66-276cafe8798e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:12Z|00210|binding|INFO|Setting lport 69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 ovn-installed in OVS
Dec 11 01:13:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:12Z|00211|binding|INFO|Setting lport 69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 up in Southbound
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.779 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:12 np0005554845 NetworkManager[55529]: <info>  [1765433592.7881] device (tap63692175-a0): carrier: link connected
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.791 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[e32de07a-acaa-4118-ae1d-6420bb1138c7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.806 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[01dea321-f39b-4255-bf05-b058e0c547eb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63692175-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:85:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386743, 'reachable_time': 44950, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220260, 'error': None, 'target': 'ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.821 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[876a1518-0ccc-4510-8503-d03cd16334c0]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe06:8506'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386743, 'tstamp': 386743}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220261, 'error': None, 'target': 'ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.835 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3b3d0019-2f1a-477e-9739-2e79af1990a3]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap63692175-a1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:85:06'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 62], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386743, 'reachable_time': 44950, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220262, 'error': None, 'target': 'ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.868 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f729b12f-40ef-483a-abb0-c4e5811ee825]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.923 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[62745170-e3e7-4df1-a625-9252b652d041]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.924 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63692175-a0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.925 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.925 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63692175-a0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:12 np0005554845 kernel: tap63692175-a0: entered promiscuous mode
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.926 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:12 np0005554845 NetworkManager[55529]: <info>  [1765433592.9275] manager: (tap63692175-a0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/107)
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.929 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.933 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap63692175-a0, col_values=(('external_ids', {'iface-id': '58d1a3bc-768a-4760-a4ef-4164babedb84'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.934 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:12 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:12Z|00212|binding|INFO|Releasing lport 58d1a3bc-768a-4760-a4ef-4164babedb84 from this chassis (sb_readonly=0)
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.935 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.937 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/63692175-a3b4-4228-86f4-602a703ce14b.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/63692175-a3b4-4228-86f4-602a703ce14b.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.938 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c8b0cb77-1033-451b-9f45-e26d699cee67]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.938 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-63692175-a3b4-4228-86f4-602a703ce14b
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/63692175-a3b4-4228-86f4-602a703ce14b.pid.haproxy
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 63692175-a3b4-4228-86f4-602a703ce14b
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:13:12 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:12.939 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b', 'env', 'PROCESS_TAG=haproxy-63692175-a3b4-4228-86f4-602a703ce14b', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/63692175-a3b4-4228-86f4-602a703ce14b.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:13:12 np0005554845 nova_compute[187128]: 2025-12-11 06:13:12.945 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.190 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433593.1903505, b7fcd131-1c40-4ddc-9d8a-6a9b503cb773 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.192 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] VM Started (Lifecycle Event)#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.221 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.227 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433593.1905622, b7fcd131-1c40-4ddc-9d8a-6a9b503cb773 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.228 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.256 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.262 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.285 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:13:13 np0005554845 podman[220301]: 2025-12-11 06:13:13.371053229 +0000 UTC m=+0.062943912 container create 8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 01:13:13 np0005554845 systemd[1]: Started libpod-conmon-8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973.scope.
Dec 11 01:13:13 np0005554845 podman[220301]: 2025-12-11 06:13:13.335877932 +0000 UTC m=+0.027768635 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:13:13 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:13:13 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/5a4bc3cc57046bdd94243802b2b2cddb8eadb6335b7b931d469a93325abf24d7/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:13:13 np0005554845 podman[220301]: 2025-12-11 06:13:13.464043087 +0000 UTC m=+0.155933790 container init 8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 11 01:13:13 np0005554845 podman[220301]: 2025-12-11 06:13:13.470084943 +0000 UTC m=+0.161975616 container start 8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, io.buildah.version=1.41.3)
Dec 11 01:13:13 np0005554845 neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b[220316]: [NOTICE]   (220320) : New worker (220322) forked
Dec 11 01:13:13 np0005554845 neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b[220316]: [NOTICE]   (220320) : Loading success.
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.551 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 in datapath d1d0291b-2b4e-477b-a989-16bcd5f034d4 unbound from our chassis#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.554 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d1d0291b-2b4e-477b-a989-16bcd5f034d4#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.565 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f8e0a3f5-eb5d-47c6-b522-702332c6bb95]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.567 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd1d0291b-21 in ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.569 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd1d0291b-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.569 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[072a3d87-b150-41eb-abb7-648b9ab10078]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.569 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d404723c-fbd1-4d8c-b7fe-63bdc74059c3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.580 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[df3900bd-f219-4d37-a776-7b343de9ee8a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.604 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d96a5dcd-3a6f-45c0-a254-acaeffa9c9a0]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.635 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[984e9baf-0be9-4f89-ae74-4d3e57bbde08]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.641 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ac001f70-eb5b-4164-beb8-3ccd0aa35ca7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 NetworkManager[55529]: <info>  [1765433593.6422] manager: (tapd1d0291b-20): new Veth device (/org/freedesktop/NetworkManager/Devices/108)
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.674 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[32c6829f-ed53-49b1-9dd0-ff6861ef59a3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.678 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[7abe7b9e-f69e-462e-9505-8e8c783b2f83]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 NetworkManager[55529]: <info>  [1765433593.7027] device (tapd1d0291b-20): carrier: link connected
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.705 187132 DEBUG nova.compute.manager [req-61848c26-78b9-4265-a8f6-be3c4fd7f6b9 req-28eb5d55-7302-4b50-bcb2-4b395407e67e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-vif-plugged-4afc532b-f213-41cf-9252-65894783ee04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.705 187132 DEBUG oslo_concurrency.lockutils [req-61848c26-78b9-4265-a8f6-be3c4fd7f6b9 req-28eb5d55-7302-4b50-bcb2-4b395407e67e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.706 187132 DEBUG oslo_concurrency.lockutils [req-61848c26-78b9-4265-a8f6-be3c4fd7f6b9 req-28eb5d55-7302-4b50-bcb2-4b395407e67e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.706 187132 DEBUG oslo_concurrency.lockutils [req-61848c26-78b9-4265-a8f6-be3c4fd7f6b9 req-28eb5d55-7302-4b50-bcb2-4b395407e67e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.706 187132 DEBUG nova.compute.manager [req-61848c26-78b9-4265-a8f6-be3c4fd7f6b9 req-28eb5d55-7302-4b50-bcb2-4b395407e67e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Processing event network-vif-plugged-4afc532b-f213-41cf-9252-65894783ee04 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.708 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[b4538738-61d4-4fb7-b443-33b29ffc8de3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.729 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[bd82be91-03cc-4911-8356-e3b537f9840a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1d0291b-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:a2:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386834, 'reachable_time': 29825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220341, 'error': None, 'target': 'ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.747 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2bc6a39c-e826-491e-bc57-34105878e487]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe6a:a2c8'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 386834, 'tstamp': 386834}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 220342, 'error': None, 'target': 'ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.770 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[62b5e1f0-de28-4129-9990-141459bcd9a7]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd1d0291b-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:6a:a2:c8'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 63], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386834, 'reachable_time': 29825, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 220343, 'error': None, 'target': 'ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.796 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c33288b5-74fe-4d1e-8f6f-46baae15d8e1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.833 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[58ff89cb-f8ea-46dd-86c6-a852c0fb33f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.834 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1d0291b-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.835 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.835 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd1d0291b-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:13 np0005554845 kernel: tapd1d0291b-20: entered promiscuous mode
Dec 11 01:13:13 np0005554845 NetworkManager[55529]: <info>  [1765433593.8385] manager: (tapd1d0291b-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/109)
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.838 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.840 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd1d0291b-20, col_values=(('external_ids', {'iface-id': '2646f808-9283-4885-ac09-2ea314894a12'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:13 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:13Z|00213|binding|INFO|Releasing lport 2646f808-9283-4885-ac09-2ea314894a12 from this chassis (sb_readonly=0)
Dec 11 01:13:13 np0005554845 nova_compute[187128]: 2025-12-11 06:13:13.852 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.853 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d1d0291b-2b4e-477b-a989-16bcd5f034d4.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d1d0291b-2b4e-477b-a989-16bcd5f034d4.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.854 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d8778b87-4b2e-49cb-b9da-374be49599be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.854 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-d1d0291b-2b4e-477b-a989-16bcd5f034d4
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/d1d0291b-2b4e-477b-a989-16bcd5f034d4.pid.haproxy
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID d1d0291b-2b4e-477b-a989-16bcd5f034d4
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:13:13 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:13.855 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'env', 'PROCESS_TAG=haproxy-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d1d0291b-2b4e-477b-a989-16bcd5f034d4.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:13:14 np0005554845 podman[220371]: 2025-12-11 06:13:14.221873616 +0000 UTC m=+0.050060408 container create d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 11 01:13:14 np0005554845 systemd[1]: Started libpod-conmon-d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5.scope.
Dec 11 01:13:14 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:13:14 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/f31058af8145d8311069195bf7e8b159f46b56fd22121985e47b921ad22fb479/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:13:14 np0005554845 podman[220371]: 2025-12-11 06:13:14.196686234 +0000 UTC m=+0.024873066 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:13:14 np0005554845 podman[220371]: 2025-12-11 06:13:14.299782429 +0000 UTC m=+0.127969241 container init d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:13:14 np0005554845 podman[220371]: 2025-12-11 06:13:14.306066101 +0000 UTC m=+0.134252913 container start d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.332 187132 DEBUG nova.compute.manager [req-9eb54587-19c4-465c-84cc-00dad45e8b91 req-cb83af02-79ec-4366-be81-62f472cd7c5c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-vif-plugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:14 np0005554845 neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4[220386]: [NOTICE]   (220390) : New worker (220392) forked
Dec 11 01:13:14 np0005554845 neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4[220386]: [NOTICE]   (220390) : Loading success.
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.333 187132 DEBUG oslo_concurrency.lockutils [req-9eb54587-19c4-465c-84cc-00dad45e8b91 req-cb83af02-79ec-4366-be81-62f472cd7c5c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.334 187132 DEBUG oslo_concurrency.lockutils [req-9eb54587-19c4-465c-84cc-00dad45e8b91 req-cb83af02-79ec-4366-be81-62f472cd7c5c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.334 187132 DEBUG oslo_concurrency.lockutils [req-9eb54587-19c4-465c-84cc-00dad45e8b91 req-cb83af02-79ec-4366-be81-62f472cd7c5c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.335 187132 DEBUG nova.compute.manager [req-9eb54587-19c4-465c-84cc-00dad45e8b91 req-cb83af02-79ec-4366-be81-62f472cd7c5c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Processing event network-vif-plugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.336 187132 DEBUG nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.341 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433594.3407583, b7fcd131-1c40-4ddc-9d8a-6a9b503cb773 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.341 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.344 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.348 187132 INFO nova.virt.libvirt.driver [-] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Instance spawned successfully.#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.349 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.373 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.380 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.387 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.388 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.390 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.391 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.392 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.393 187132 DEBUG nova.virt.libvirt.driver [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.404 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.471 187132 INFO nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Took 15.51 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.472 187132 DEBUG nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.572 187132 INFO nova.compute.manager [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Took 16.08 seconds to build instance.#033[00m
Dec 11 01:13:14 np0005554845 nova_compute[187128]: 2025-12-11 06:13:14.608 187132 DEBUG oslo_concurrency.lockutils [None req-84ae5277-2a2f-40e2-b647-550942d3d1c5 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 16.197s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:15 np0005554845 nova_compute[187128]: 2025-12-11 06:13:15.218 187132 DEBUG nova.network.neutron [req-14adbf24-9e86-4b58-9a5c-bb12309050e6 req-58b0ff3f-b97f-4e1e-929f-fdd1c9209e4b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Updated VIF entry in instance network info cache for port 69f4d5c0-8f90-4321-8f66-92eb4d8d49b3. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:13:15 np0005554845 nova_compute[187128]: 2025-12-11 06:13:15.218 187132 DEBUG nova.network.neutron [req-14adbf24-9e86-4b58-9a5c-bb12309050e6 req-58b0ff3f-b97f-4e1e-929f-fdd1c9209e4b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Updating instance_info_cache with network_info: [{"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:13:15 np0005554845 nova_compute[187128]: 2025-12-11 06:13:15.268 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:15 np0005554845 nova_compute[187128]: 2025-12-11 06:13:15.309 187132 DEBUG oslo_concurrency.lockutils [req-14adbf24-9e86-4b58-9a5c-bb12309050e6 req-58b0ff3f-b97f-4e1e-929f-fdd1c9209e4b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:13:15 np0005554845 nova_compute[187128]: 2025-12-11 06:13:15.419 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.070 187132 DEBUG nova.compute.manager [req-2762001f-db48-466e-91b3-b0fe116d296f req-d1bc3b9b-2dc0-4f5b-bb2f-ce9cd0e085bd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-vif-plugged-4afc532b-f213-41cf-9252-65894783ee04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.071 187132 DEBUG oslo_concurrency.lockutils [req-2762001f-db48-466e-91b3-b0fe116d296f req-d1bc3b9b-2dc0-4f5b-bb2f-ce9cd0e085bd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.071 187132 DEBUG oslo_concurrency.lockutils [req-2762001f-db48-466e-91b3-b0fe116d296f req-d1bc3b9b-2dc0-4f5b-bb2f-ce9cd0e085bd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.072 187132 DEBUG oslo_concurrency.lockutils [req-2762001f-db48-466e-91b3-b0fe116d296f req-d1bc3b9b-2dc0-4f5b-bb2f-ce9cd0e085bd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.072 187132 DEBUG nova.compute.manager [req-2762001f-db48-466e-91b3-b0fe116d296f req-d1bc3b9b-2dc0-4f5b-bb2f-ce9cd0e085bd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] No waiting events found dispatching network-vif-plugged-4afc532b-f213-41cf-9252-65894783ee04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.072 187132 WARNING nova.compute.manager [req-2762001f-db48-466e-91b3-b0fe116d296f req-d1bc3b9b-2dc0-4f5b-bb2f-ce9cd0e085bd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received unexpected event network-vif-plugged-4afc532b-f213-41cf-9252-65894783ee04 for instance with vm_state active and task_state None.#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.535 187132 DEBUG nova.compute.manager [req-a49e8b78-2ec3-40e0-bf86-f625863b1fac req-30fbfb76-6ef3-4b95-86e8-67d5037a8a22 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-vif-plugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.535 187132 DEBUG oslo_concurrency.lockutils [req-a49e8b78-2ec3-40e0-bf86-f625863b1fac req-30fbfb76-6ef3-4b95-86e8-67d5037a8a22 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.536 187132 DEBUG oslo_concurrency.lockutils [req-a49e8b78-2ec3-40e0-bf86-f625863b1fac req-30fbfb76-6ef3-4b95-86e8-67d5037a8a22 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.536 187132 DEBUG oslo_concurrency.lockutils [req-a49e8b78-2ec3-40e0-bf86-f625863b1fac req-30fbfb76-6ef3-4b95-86e8-67d5037a8a22 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.537 187132 DEBUG nova.compute.manager [req-a49e8b78-2ec3-40e0-bf86-f625863b1fac req-30fbfb76-6ef3-4b95-86e8-67d5037a8a22 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] No waiting events found dispatching network-vif-plugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.537 187132 WARNING nova.compute.manager [req-a49e8b78-2ec3-40e0-bf86-f625863b1fac req-30fbfb76-6ef3-4b95-86e8-67d5037a8a22 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received unexpected event network-vif-plugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 for instance with vm_state active and task_state None.#033[00m
Dec 11 01:13:16 np0005554845 nova_compute[187128]: 2025-12-11 06:13:16.626 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:17 np0005554845 podman[220401]: 2025-12-11 06:13:17.149358389 +0000 UTC m=+0.067210020 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:13:18 np0005554845 nova_compute[187128]: 2025-12-11 06:13:18.634 187132 DEBUG nova.compute.manager [req-5a28fb33-05a6-4ca0-ad49-f98d807ec6cb req-07898893-008b-4a33-ba9e-aae68f7bb6d6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-changed-4afc532b-f213-41cf-9252-65894783ee04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:18 np0005554845 nova_compute[187128]: 2025-12-11 06:13:18.635 187132 DEBUG nova.compute.manager [req-5a28fb33-05a6-4ca0-ad49-f98d807ec6cb req-07898893-008b-4a33-ba9e-aae68f7bb6d6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Refreshing instance network info cache due to event network-changed-4afc532b-f213-41cf-9252-65894783ee04. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:13:18 np0005554845 nova_compute[187128]: 2025-12-11 06:13:18.635 187132 DEBUG oslo_concurrency.lockutils [req-5a28fb33-05a6-4ca0-ad49-f98d807ec6cb req-07898893-008b-4a33-ba9e-aae68f7bb6d6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:13:18 np0005554845 nova_compute[187128]: 2025-12-11 06:13:18.635 187132 DEBUG oslo_concurrency.lockutils [req-5a28fb33-05a6-4ca0-ad49-f98d807ec6cb req-07898893-008b-4a33-ba9e-aae68f7bb6d6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:13:18 np0005554845 nova_compute[187128]: 2025-12-11 06:13:18.635 187132 DEBUG nova.network.neutron [req-5a28fb33-05a6-4ca0-ad49-f98d807ec6cb req-07898893-008b-4a33-ba9e-aae68f7bb6d6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Refreshing network info cache for port 4afc532b-f213-41cf-9252-65894783ee04 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:13:19 np0005554845 nova_compute[187128]: 2025-12-11 06:13:19.800 187132 DEBUG nova.network.neutron [req-5a28fb33-05a6-4ca0-ad49-f98d807ec6cb req-07898893-008b-4a33-ba9e-aae68f7bb6d6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Updated VIF entry in instance network info cache for port 4afc532b-f213-41cf-9252-65894783ee04. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:13:19 np0005554845 nova_compute[187128]: 2025-12-11 06:13:19.800 187132 DEBUG nova.network.neutron [req-5a28fb33-05a6-4ca0-ad49-f98d807ec6cb req-07898893-008b-4a33-ba9e-aae68f7bb6d6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Updating instance_info_cache with network_info: [{"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:13:19 np0005554845 nova_compute[187128]: 2025-12-11 06:13:19.822 187132 DEBUG oslo_concurrency.lockutils [req-5a28fb33-05a6-4ca0-ad49-f98d807ec6cb req-07898893-008b-4a33-ba9e-aae68f7bb6d6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:13:20 np0005554845 nova_compute[187128]: 2025-12-11 06:13:20.421 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:21 np0005554845 nova_compute[187128]: 2025-12-11 06:13:21.684 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:23 np0005554845 podman[220426]: 2025-12-11 06:13:23.12814694 +0000 UTC m=+0.060423183 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, 
org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 11 01:13:25 np0005554845 nova_compute[187128]: 2025-12-11 06:13:25.465 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:25 np0005554845 nova_compute[187128]: 2025-12-11 06:13:25.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:13:25 np0005554845 nova_compute[187128]: 2025-12-11 06:13:25.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec 11 01:13:25 np0005554845 nova_compute[187128]: 2025-12-11 06:13:25.710 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec 11 01:13:26 np0005554845 podman[220468]: 2025-12-11 06:13:26.115101198 +0000 UTC m=+0.049660977 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 01:13:26 np0005554845 podman[220469]: 2025-12-11 06:13:26.138864001 +0000 UTC m=+0.072155545 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller)
Dec 11 01:13:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:26.225 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:13:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:26.226 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:13:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:26.227 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:13:26 np0005554845 nova_compute[187128]: 2025-12-11 06:13:26.702 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:13:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:27Z|00028|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:af:0b:4a 10.100.0.10
Dec 11 01:13:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:27Z|00029|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:af:0b:4a 10.100.0.10
Dec 11 01:13:27 np0005554845 nova_compute[187128]: 2025-12-11 06:13:27.710 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:13:27 np0005554845 nova_compute[187128]: 2025-12-11 06:13:27.710 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858
Dec 11 01:13:27 np0005554845 nova_compute[187128]: 2025-12-11 06:13:27.710 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862
Dec 11 01:13:28 np0005554845 podman[220511]: 2025-12-11 06:13:28.135487207 +0000 UTC m=+0.065219474 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:13:28 np0005554845 nova_compute[187128]: 2025-12-11 06:13:28.711 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 01:13:28 np0005554845 nova_compute[187128]: 2025-12-11 06:13:28.711 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 01:13:28 np0005554845 nova_compute[187128]: 2025-12-11 06:13:28.712 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004
Dec 11 01:13:28 np0005554845 nova_compute[187128]: 2025-12-11 06:13:28.712 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid b7fcd131-1c40-4ddc-9d8a-6a9b503cb773 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.100 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'name': 'tempest-TestGettingAddress-server-1686231499', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000001f', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'user_id': '60e9372de4754580913a836e11b9c248', 'hostId': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.101 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.101 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.101 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1686231499>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1686231499>]
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.104 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b7fcd131-1c40-4ddc-9d8a-6a9b503cb773 / tap4afc532b-f2 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.105 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for b7fcd131-1c40-4ddc-9d8a-6a9b503cb773 / tap69f4d5c0-8f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.105 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.105 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1054f503-c32c-4291-983e-efa353380d69', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap4afc532b-f2', 'timestamp': '2025-12-11T06:13:30.101903', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap4afc532b-f2', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:0b:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4afc532b-f2'}, 'message_id': '838dc198-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': '27f50ffa7d458e02ccc6d15d8ea69e5aa5e4f343452a326699cf6f1ece07dc9d'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap69f4d5c0-8f', 'timestamp': '2025-12-11T06:13:30.101903', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap69f4d5c0-8f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6b:89:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f4d5c0-8f'}, 'message_id': '838dcf6c-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': '2e190b52ffe2713a5c85c35d346cae522f8d22bda4313867085baec1942aca0e'}]}, 'timestamp': '2025-12-11 06:13:30.106206', '_unique_id': 'f39b35c8295b49a4a381ac858b89b427'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.107 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.108 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.108 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1686231499>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1686231499>]
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.108 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.118 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.usage volume: 29753344 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.118 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8f3d01c-afcc-445f-a406-234dfc759ed6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29753344, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-vda', 'timestamp': '2025-12-11T06:13:30.108728', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '838fb002-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.810420023, 'message_signature': '3da1836b4ea128ddf28919145840414d7afa43791ca0dae2f4dc580c3598339f'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 
'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-sda', 'timestamp': '2025-12-11T06:13:30.108728', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '838fbbf6-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.810420023, 'message_signature': '404f08be9db89a264156c87b851ae0f7cdb80de36faabd869304c8cf685fe102'}]}, 'timestamp': '2025-12-11 06:13:30.118811', '_unique_id': 'a851279913714126af70f49f64d79110'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.119 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.120 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.120 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1686231499>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1686231499>]
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.120 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.121 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.incoming.bytes volume: 1648 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.121 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.incoming.bytes volume: 772 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b96ce410-e911-4729-a87a-c98fb877178c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1648, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap4afc532b-f2', 'timestamp': '2025-12-11T06:13:30.121002', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap4afc532b-f2', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:0b:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4afc532b-f2'}, 'message_id': '83901e70-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': 'b91857eadfcb6f9fee3a6aaf9ca333526a5f163de2d0b4fe6126534e8bbafa46'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 772, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap69f4d5c0-8f', 'timestamp': '2025-12-11T06:13:30.121002', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap69f4d5c0-8f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6b:89:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f4d5c0-8f'}, 'message_id': '83902b4a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': 'a2026b31f0f7ad55e52fe9fd5ae300701d774a3403c4a1d452f2389e22edea9a'}]}, 'timestamp': '2025-12-11 06:13:30.121661', '_unique_id': 'b05264a341204575befe9aecdb496975'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.122 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.123 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.147 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.read.requests volume: 1135 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.147 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1eee6637-bf06-4e6a-8d5a-6b05e179e83f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1135, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-vda', 'timestamp': '2025-12-11T06:13:30.123220', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83942592-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': '4d77ed89b80d79d8de9275df81e90cbeb83fa0471e0fe7316e7f127ed218bddc'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-sda', 'timestamp': '2025-12-11T06:13:30.123220', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83942f10-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': '196be65b4ad60c283269f4cf879abf0fa5883e8808dfacfbc96e43c7e70693ec'}]}, 'timestamp': '2025-12-11 06:13:30.147919', '_unique_id': '224c5efebdc94a5ca65b11828077ec2c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.148 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.149 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.149 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.read.bytes volume: 31017472 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.149 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bb55b204-88d2-4f8d-96fa-31dad2eba7b9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 31017472, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-vda', 'timestamp': '2025-12-11T06:13:30.149298', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83947024-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': '9c535de275227c59855ea053c69dd7e301a5a508739c2f19e813638c757bf212'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-sda', 'timestamp': '2025-12-11T06:13:30.149298', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8394788a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': '1b82a57fcd686aae7a86eeaf338de829783b4c05a6468d330301ee1f76fe7c90'}]}, 'timestamp': '2025-12-11 06:13:30.149793', '_unique_id': '23c38e481c0a45d79827aa7980a36238'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.150 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7b6e948a-2cb6-4261-8271-a8df3840e430', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap4afc532b-f2', 'timestamp': '2025-12-11T06:13:30.150922', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap4afc532b-f2', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:0b:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4afc532b-f2'}, 'message_id': '8394ad1e-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': 'eda7ed4b6f318b0ae920514fe46a60f364dea888a75e02bbaf41ef053c11e740'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap69f4d5c0-8f', 'timestamp': '2025-12-11T06:13:30.150922', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap69f4d5c0-8f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6b:89:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f4d5c0-8f'}, 'message_id': '8394b57a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': '29d2adf52eb480145c80e77377bbfaec6517e99118270ffb2db2d3b8958e74e0'}]}, 'timestamp': '2025-12-11 06:13:30.151408', '_unique_id': 'e6ba0af784dd49f3b626d31a47f1c6fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.151 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.152 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.152 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.outgoing.bytes volume: 1326 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.152 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.outgoing.bytes volume: 1382 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '11aaed6c-f35d-4449-9909-7dc8e58c3914', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1326, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap4afc532b-f2', 'timestamp': '2025-12-11T06:13:30.152617', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap4afc532b-f2', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:0b:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4afc532b-f2'}, 'message_id': '8394ef36-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': 'f8af602912b8c05b59ef1ec9782fe4f390d23f9a17c5584b8f37064d60f1c565'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 1382, 'user_id': 
'60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap69f4d5c0-8f', 'timestamp': '2025-12-11T06:13:30.152617', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap69f4d5c0-8f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6b:89:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f4d5c0-8f'}, 'message_id': '8394f9ea-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': '49f241c12b5088a1356fc9ff62f25ab5fdc50ec398d00046a5626d401d347d21'}]}, 'timestamp': '2025-12-11 06:13:30.153114', '_unique_id': '2cf7de8d05334f848f8ff0640994514e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.153 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.154 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.154 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.write.requests volume: 282 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.154 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24131678-7bc2-4a8c-916d-8e63d244645e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 282, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-vda', 'timestamp': '2025-12-11T06:13:30.154255', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83952f32-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': 'ad342de079a607deeb42c8356683eb238d90f5d5a7384cec909d951c96a8ef7e'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': 
None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-sda', 'timestamp': '2025-12-11T06:13:30.154255', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '839537f2-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': 'c23b8bf58d5f57c4c655fd70d0dae97d778c8ca5a0e5ab3e63a05751ff04013c'}]}, 'timestamp': '2025-12-11 06:13:30.154692', '_unique_id': '3912f759ce4b4523bda2256f80b53b76'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.155 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.write.latency volume: 4604581328 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8da8cb31-6b53-42cd-aa58-db64d8c5012d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4604581328, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-vda', 'timestamp': '2025-12-11T06:13:30.155875', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83956e5c-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': '6588cdbd7ac64b6f9f07af5093f02d30b28839a3e67dfd0cfcb7b9fab523d1b9'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-sda', 'timestamp': '2025-12-11T06:13:30.155875', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '83957cee-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': 'b826bd83fb1f4a2f08d93fb3e56ad46f8d151aa3c1ca14e12f334022c29a688e'}]}, 'timestamp': '2025-12-11 06:13:30.156490', '_unique_id': '70712965b8524cb58401fd5b0a17f6e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.156 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.157 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.157 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.157 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fade733-4e58-4a84-b26a-58c77a2cea56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap4afc532b-f2', 'timestamp': '2025-12-11T06:13:30.157625', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap4afc532b-f2', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:0b:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4afc532b-f2'}, 'message_id': '8395b2cc-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': '2f7a092d81c8bf8cfb8f395b3c97b845a7f23eb9178f1551f68816cb47b5852d'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap69f4d5c0-8f', 'timestamp': '2025-12-11T06:13:30.157625', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap69f4d5c0-8f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6b:89:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f4d5c0-8f'}, 'message_id': '8395bb14-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': '85df079d8ef42a24614fe7e9db47cf1aaeeca294930a3add1f9b6e40555e493b'}]}, 'timestamp': '2025-12-11 06:13:30.158056', '_unique_id': '164a93f6a02c41bc8ebd54b31d22edee'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.158 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.159 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.159 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.incoming.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.159 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.incoming.packets volume: 8 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1dff22e4-acf2-4851-bfb8-23936acb15a2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap4afc532b-f2', 'timestamp': '2025-12-11T06:13:30.159166', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap4afc532b-f2', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:0b:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4afc532b-f2'}, 'message_id': '8395eeea-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': '03bb942816ca6c08fbab05cd7b5c4451a15d027d2b25e1c68825fb36aafbdd62'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 8, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap69f4d5c0-8f', 'timestamp': '2025-12-11T06:13:30.159166', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap69f4d5c0-8f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6b:89:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f4d5c0-8f'}, 'message_id': '8395f930-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': 'a211455e74aa9dc617418c1e8ef785b4a31b5cd9d1311d37d867e11c155af379'}]}, 'timestamp': '2025-12-11 06:13:30.159654', '_unique_id': '6af7b4e68a6b491bba13e173606a5d34'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.160 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'befb7ff6-8935-419b-bf68-3b9cdbe65348', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-vda', 'timestamp': '2025-12-11T06:13:30.160879', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '839631f2-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.810420023, 'message_signature': 'a1da39874e61db7d84142145ea49da50f606339adb01d16b8c2935bbd55336bd'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 
'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-sda', 'timestamp': '2025-12-11T06:13:30.160879', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '839639d6-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.810420023, 'message_signature': 'bd1ab3d4ebfd4ff7b881640c50488e97f3e6ce907f9d76d54381bd12d0d9137d'}]}, 'timestamp': '2025-12-11 06:13:30.161291', '_unique_id': '8eae21fe66334b9e8af25475bb8de963'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.161 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.162 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.162 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1686231499>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1686231499>]
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.162 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.outgoing.packets volume: 11 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.162 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.outgoing.packets volume: 13 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '22d14b5f-7e53-4ada-b524-57cafb6eef1c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 11, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap4afc532b-f2', 'timestamp': '2025-12-11T06:13:30.162641', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap4afc532b-f2', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:0b:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4afc532b-f2'}, 'message_id': '8396769e-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': '596f5125297d96d201d4780ff14344e7971055ecabeedd1b2eebc2c080303340'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 13, 
'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap69f4d5c0-8f', 'timestamp': '2025-12-11T06:13:30.162641', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap69f4d5c0-8f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6b:89:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f4d5c0-8f'}, 'message_id': '83967eaa-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': 'f8c1dd6890dfc1935d65f409e865eace159d48b9be5be00fc5051447ba2e4039'}]}, 'timestamp': '2025-12-11 06:13:30.163059', '_unique_id': '478aedc29c1a403fb9b6170fdb935a36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.163 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3208b7e4-45c1-4bd4-89ce-6c25c03d399a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap4afc532b-f2', 'timestamp': '2025-12-11T06:13:30.164126', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap4afc532b-f2', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:0b:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4afc532b-f2'}, 'message_id': '8396b082-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': 'ecb9251ddc7a5ea8030f1aae7b124d03cf663c540ae8f9cc9a807835002a2fc6'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap69f4d5c0-8f', 'timestamp': '2025-12-11T06:13:30.164126', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap69f4d5c0-8f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6b:89:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f4d5c0-8f'}, 'message_id': '8396b924-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': '48e971c05cedd1ad20e6a3e1e29b1108468d6fbe57b977a2a3f091d5ab5929ae'}]}, 'timestamp': '2025-12-11 06:13:30.164558', '_unique_id': 'c895657b62eb40189ae0c0cc77171557'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.164 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.165 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.181 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/memory.usage volume: 40.39453125 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9a1436a-8773-49ea-bbc3-1a133bcb7c44', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 40.39453125, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'timestamp': '2025-12-11T06:13:30.165596', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': '83995832-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.883006508, 'message_signature': 'c9860ac1879fd2c83d616362126ab6917af4e4e023689bee3df0ac421ab2dc98'}]}, 'timestamp': '2025-12-11 06:13:30.181772', '_unique_id': 'faaec174f9614de48381570b665ee1b5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.182 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.183 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.183 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.183 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '525efd94-1e71-4250-8f58-53fda26f06e4', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-vda', 'timestamp': '2025-12-11T06:13:30.183213', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '83999a9a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.810420023, 'message_signature': 'fa69b4e0f493a25726a3a121fe6f596a065050982a999d8f7f26db00f5b49d42'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 
'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-sda', 'timestamp': '2025-12-11T06:13:30.183213', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '8399a378-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.810420023, 'message_signature': '7ac504cc636d3b50fc771dbcc2288d373343aafe44c7b36b294f72300f029613'}]}, 'timestamp': '2025-12-11 06:13:30.183693', '_unique_id': '5c46265b628541a29c06a1699abd0392'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.184 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/cpu volume: 11990000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4e0e8450-6af6-4944-9ef6-c166e009525d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11990000000, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'timestamp': '2025-12-11T06:13:30.185061', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '8399e392-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.883006508, 'message_signature': 'bb6fa750e6ff8eb28f1d46d76a3e819663ccc96a09de90dd7c7bd7ff1eed03ad'}]}, 'timestamp': '2025-12-11 06:13:30.185341', '_unique_id': '6d504c0cbe344469a572798c9eef9274'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.185 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.186 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.186 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.read.latency volume: 194167853 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.186 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.read.latency volume: 17441314 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '155609b1-f612-49a2-8ab7-ce573d222ce5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 194167853, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-vda', 'timestamp': '2025-12-11T06:13:30.186523', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '839a1bf0-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': '1c2a9db590bcf47c409fb0e9680c3611ff0c57f9e3ff9d0b1e3956f15817a731'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 17441314, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-sda', 'timestamp': '2025-12-11T06:13:30.186523', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '839a237a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': 'b6f242696cbbb72e0fe1bed8a45bf4a25b0c96e4ad5ef8e30c6b1166e61967ce'}]}, 'timestamp': '2025-12-11 06:13:30.186933', '_unique_id': '1e09421184904b7fa044f1485941faed'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.187 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4cb8808b-9cf5-46eb-b60b-4eb4e4097a1a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap4afc532b-f2', 'timestamp': '2025-12-11T06:13:30.187992', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap4afc532b-f2', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:0b:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4afc532b-f2'}, 'message_id': '839a5502-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': 'ad782164bb873856df63f20d408cb920f6946223abe198f406602f1593c83658'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap69f4d5c0-8f', 'timestamp': '2025-12-11T06:13:30.187992', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap69f4d5c0-8f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6b:89:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f4d5c0-8f'}, 'message_id': '839a5cdc-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': '5f9defd069529b4b0d6e269ecc857717fe7aeea9ae22dbb5de64671b07db1221'}]}, 'timestamp': '2025-12-11 06:13:30.188429', '_unique_id': 'cb78a670a889496d9da552eb7af47de2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.188 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.189 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.189 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.write.bytes volume: 72753152 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.189 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '3b22e02f-fcdb-405d-9847-1c11021383e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72753152, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-vda', 'timestamp': '2025-12-11T06:13:30.189532', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '839a9116-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': '92183ef00079cb3e52aecb108919c006140fba8cce79205467d09ced66578b0d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-sda', 'timestamp': '2025-12-11T06:13:30.189532', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'instance-0000001f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '839a98a0-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.824923041, 'message_signature': 'd10ca089f570b1dbf5c5f315973a5b5b36caa329355f83582721036affa88a5c'}]}, 'timestamp': '2025-12-11 06:13:30.189956', '_unique_id': '7e18c0f087da408b8a46594d47c4e412'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.191 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.191 12 DEBUG ceilometer.compute.pollsters [-] b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0ae52396-a978-4310-9460-8ef994a3ba63', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap4afc532b-f2', 'timestamp': '2025-12-11T06:13:30.191034', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap4afc532b-f2', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:af:0b:4a', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4afc532b-f2'}, 'message_id': '839accf8-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': '65301798da70ac0c3d8dea0f09b13701efda6220c3f86651c5e124272fef6010'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000001f-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-tap69f4d5c0-8f', 'timestamp': '2025-12-11T06:13:30.191034', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1686231499', 'name': 'tap69f4d5c0-8f', 'instance_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:6b:89:3c', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap69f4d5c0-8f'}, 'message_id': '839ad6d0-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 3884.803572244, 'message_signature': 'd5ae6e419442ca929405b3ce93ec8385d1eed4ebb6af204da7d7cac408487e0a'}]}, 'timestamp': '2025-12-11 06:13:30.191535', '_unique_id': 'ef1360b13f7a46c8b9fa998994d6a505'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:13:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:13:30.192 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:13:30 np0005554845 nova_compute[187128]: 2025-12-11 06:13:30.469 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:31 np0005554845 nova_compute[187128]: 2025-12-11 06:13:31.709 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:32 np0005554845 podman[220534]: 2025-12-11 06:13:32.1207889 +0000 UTC m=+0.051314873 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.33.7, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, version=9.6, managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal 
Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, config_id=edpm, container_name=openstack_network_exporter, distribution-scope=public, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 11 01:13:32 np0005554845 podman[220533]: 2025-12-11 06:13:32.127084462 +0000 UTC m=+0.056425672 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.598 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Updating instance_info_cache with network_info: [{"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.620 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.620 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.621 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.621 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.621 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.621 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.622 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.622 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.642 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.642 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.642 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.643 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.703 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.761 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.762 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.815 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.983 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.984 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5500MB free_disk=73.26372146606445GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.985 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:32 np0005554845 nova_compute[187128]: 2025-12-11 06:13:32.985 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:33 np0005554845 nova_compute[187128]: 2025-12-11 06:13:33.096 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance b7fcd131-1c40-4ddc-9d8a-6a9b503cb773 actively managed on this compute host and has allocations in placement: {'resources': {'VCPU': 1, 'MEMORY_MB': 128, 'DISK_GB': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:13:33 np0005554845 nova_compute[187128]: 2025-12-11 06:13:33.097 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:13:33 np0005554845 nova_compute[187128]: 2025-12-11 06:13:33.097 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:13:33 np0005554845 nova_compute[187128]: 2025-12-11 06:13:33.192 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:13:33 np0005554845 nova_compute[187128]: 2025-12-11 06:13:33.207 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:13:33 np0005554845 nova_compute[187128]: 2025-12-11 06:13:33.230 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:13:33 np0005554845 nova_compute[187128]: 2025-12-11 06:13:33.231 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.246s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:35 np0005554845 nova_compute[187128]: 2025-12-11 06:13:35.208 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:13:35 np0005554845 nova_compute[187128]: 2025-12-11 06:13:35.209 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:13:35 np0005554845 nova_compute[187128]: 2025-12-11 06:13:35.472 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:35 np0005554845 nova_compute[187128]: 2025-12-11 06:13:35.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:13:36 np0005554845 nova_compute[187128]: 2025-12-11 06:13:36.713 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.194 187132 DEBUG nova.compute.manager [req-c4ba0aa1-a3b0-4951-965a-ed4d14bdd38d req-b69a7a32-7dbc-4944-9197-d48e700ee76f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-changed-4afc532b-f213-41cf-9252-65894783ee04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.194 187132 DEBUG nova.compute.manager [req-c4ba0aa1-a3b0-4951-965a-ed4d14bdd38d req-b69a7a32-7dbc-4944-9197-d48e700ee76f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Refreshing instance network info cache due to event network-changed-4afc532b-f213-41cf-9252-65894783ee04. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.194 187132 DEBUG oslo_concurrency.lockutils [req-c4ba0aa1-a3b0-4951-965a-ed4d14bdd38d req-b69a7a32-7dbc-4944-9197-d48e700ee76f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.195 187132 DEBUG oslo_concurrency.lockutils [req-c4ba0aa1-a3b0-4951-965a-ed4d14bdd38d req-b69a7a32-7dbc-4944-9197-d48e700ee76f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.195 187132 DEBUG nova.network.neutron [req-c4ba0aa1-a3b0-4951-965a-ed4d14bdd38d req-b69a7a32-7dbc-4944-9197-d48e700ee76f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Refreshing network info cache for port 4afc532b-f213-41cf-9252-65894783ee04 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.260 187132 DEBUG oslo_concurrency.lockutils [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.260 187132 DEBUG oslo_concurrency.lockutils [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.261 187132 DEBUG oslo_concurrency.lockutils [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.261 187132 DEBUG oslo_concurrency.lockutils [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.261 187132 DEBUG oslo_concurrency.lockutils [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.263 187132 INFO nova.compute.manager [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Terminating instance#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.264 187132 DEBUG nova.compute.manager [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:13:37 np0005554845 kernel: tap4afc532b-f2 (unregistering): left promiscuous mode
Dec 11 01:13:37 np0005554845 NetworkManager[55529]: <info>  [1765433617.2868] device (tap4afc532b-f2): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:13:37 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:37Z|00214|binding|INFO|Releasing lport 4afc532b-f213-41cf-9252-65894783ee04 from this chassis (sb_readonly=0)
Dec 11 01:13:37 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:37Z|00215|binding|INFO|Setting lport 4afc532b-f213-41cf-9252-65894783ee04 down in Southbound
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.297 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:37Z|00216|binding|INFO|Removing iface tap4afc532b-f2 ovn-installed in OVS
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.301 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 kernel: tap69f4d5c0-8f (unregistering): left promiscuous mode
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.307 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:af:0b:4a 10.100.0.10'], port_security=['fa:16:3e:af:0b:4a 10.100.0.10'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28', 'neutron:device_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-63692175-a3b4-4228-86f4-602a703ce14b', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '68d01dd1-67cf-4b05-b3d1-1764b2624dfa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2a3d0efe-c96c-491b-ac4c-78adea640873, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=4afc532b-f213-41cf-9252-65894783ee04) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.310 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 4afc532b-f213-41cf-9252-65894783ee04 in datapath 63692175-a3b4-4228-86f4-602a703ce14b unbound from our chassis#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.313 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 63692175-a3b4-4228-86f4-602a703ce14b, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.315 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7ac3cdb7-e5c9-4636-b8b3-b5d726074a95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.316 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b namespace which is not needed anymore#033[00m
Dec 11 01:13:37 np0005554845 NetworkManager[55529]: <info>  [1765433617.3182] device (tap69f4d5c0-8f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.320 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.329 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:37Z|00217|binding|INFO|Releasing lport 69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 from this chassis (sb_readonly=0)
Dec 11 01:13:37 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:37Z|00218|binding|INFO|Setting lport 69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 down in Southbound
Dec 11 01:13:37 np0005554845 ovn_controller[95428]: 2025-12-11T06:13:37Z|00219|binding|INFO|Removing iface tap69f4d5c0-8f ovn-installed in OVS
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.331 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.336 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:6b:89:3c 2001:db8:0:1:f816:3eff:fe6b:893c 2001:db8::f816:3eff:fe6b:893c'], port_security=['fa:16:3e:6b:89:3c 2001:db8:0:1:f816:3eff:fe6b:893c 2001:db8::f816:3eff:fe6b:893c'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8:0:1:f816:3eff:fe6b:893c/64 2001:db8::f816:3eff:fe6b:893c/64', 'neutron:device_id': 'b7fcd131-1c40-4ddc-9d8a-6a9b503cb773', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '68d01dd1-67cf-4b05-b3d1-1764b2624dfa', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2aa68e22-5010-4f13-b0a0-aaa483f0ac60, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=69f4d5c0-8f90-4321-8f66-92eb4d8d49b3) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.355 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001f.scope: Deactivated successfully.
Dec 11 01:13:37 np0005554845 systemd[1]: machine-qemu\x2d15\x2dinstance\x2d0000001f.scope: Consumed 14.028s CPU time.
Dec 11 01:13:37 np0005554845 systemd-machined[153381]: Machine qemu-15-instance-0000001f terminated.
Dec 11 01:13:37 np0005554845 neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b[220316]: [NOTICE]   (220320) : haproxy version is 2.8.14-c23fe91
Dec 11 01:13:37 np0005554845 neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b[220316]: [NOTICE]   (220320) : path to executable is /usr/sbin/haproxy
Dec 11 01:13:37 np0005554845 neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b[220316]: [WARNING]  (220320) : Exiting Master process...
Dec 11 01:13:37 np0005554845 neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b[220316]: [ALERT]    (220320) : Current worker (220322) exited with code 143 (Terminated)
Dec 11 01:13:37 np0005554845 neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b[220316]: [WARNING]  (220320) : All workers exited. Exiting... (0)
Dec 11 01:13:37 np0005554845 systemd[1]: libpod-8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973.scope: Deactivated successfully.
Dec 11 01:13:37 np0005554845 podman[220614]: 2025-12-11 06:13:37.472421275 +0000 UTC m=+0.049490153 container died 8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 11 01:13:37 np0005554845 NetworkManager[55529]: <info>  [1765433617.4915] manager: (tap69f4d5c0-8f): new Tun device (/org/freedesktop/NetworkManager/Devices/110)
Dec 11 01:13:37 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973-userdata-shm.mount: Deactivated successfully.
Dec 11 01:13:37 np0005554845 systemd[1]: var-lib-containers-storage-overlay-5a4bc3cc57046bdd94243802b2b2cddb8eadb6335b7b931d469a93325abf24d7-merged.mount: Deactivated successfully.
Dec 11 01:13:37 np0005554845 podman[220614]: 2025-12-11 06:13:37.524048055 +0000 UTC m=+0.101116923 container cleanup 8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.528 187132 INFO nova.virt.libvirt.driver [-] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Instance destroyed successfully.#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.528 187132 DEBUG nova.objects.instance [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'resources' on Instance uuid b7fcd131-1c40-4ddc-9d8a-6a9b503cb773 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.532 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.532 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=12, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=11) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:13:37 np0005554845 systemd[1]: libpod-conmon-8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973.scope: Deactivated successfully.
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.571 187132 DEBUG nova.virt.libvirt.vif [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:12:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686231499',display_name='tempest-TestGettingAddress-server-1686231499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686231499',id=31,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCR4mP1++4ml1efuLG9dj+GrIncgFZYen/iWp4xYfQhgvW9R/EgaRjQN1FHv7kPm7pJxVLhyVeP2wU5TC12FbnpGY+FKpeNWgR+UrlKQdDnQLuiET9srsqRIG5uxBcynUQ==',key_name='tempest-TestGettingAddress-763091187',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:13:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-jgoeblqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:13:14Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=b7fcd131-1c40-4ddc-9d8a-6a9b503cb773,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.571 187132 DEBUG nova.network.os_vif_util [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.177", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.572 187132 DEBUG nova.network.os_vif_util [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:af:0b:4a,bridge_name='br-int',has_traffic_filtering=True,id=4afc532b-f213-41cf-9252-65894783ee04,network=Network(63692175-a3b4-4228-86f4-602a703ce14b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4afc532b-f2') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.573 187132 DEBUG os_vif [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:0b:4a,bridge_name='br-int',has_traffic_filtering=True,id=4afc532b-f213-41cf-9252-65894783ee04,network=Network(63692175-a3b4-4228-86f4-602a703ce14b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4afc532b-f2') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.575 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.575 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4afc532b-f2, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.576 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.579 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.581 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.584 187132 INFO os_vif [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:af:0b:4a,bridge_name='br-int',has_traffic_filtering=True,id=4afc532b-f213-41cf-9252-65894783ee04,network=Network(63692175-a3b4-4228-86f4-602a703ce14b),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4afc532b-f2')#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.584 187132 DEBUG nova.virt.libvirt.vif [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:12:57Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1686231499',display_name='tempest-TestGettingAddress-server-1686231499',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1686231499',id=31,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCR4mP1++4ml1efuLG9dj+GrIncgFZYen/iWp4xYfQhgvW9R/EgaRjQN1FHv7kPm7pJxVLhyVeP2wU5TC12FbnpGY+FKpeNWgR+UrlKQdDnQLuiET9srsqRIG5uxBcynUQ==',key_name='tempest-TestGettingAddress-763091187',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:13:14Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-jgoeblqf',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:13:14Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=b7fcd131-1c40-4ddc-9d8a-6a9b503cb773,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.585 187132 DEBUG nova.network.os_vif_util [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.586 187132 DEBUG nova.network.os_vif_util [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:6b:89:3c,bridge_name='br-int',has_traffic_filtering=True,id=69f4d5c0-8f90-4321-8f66-92eb4d8d49b3,network=Network(d1d0291b-2b4e-477b-a989-16bcd5f034d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f4d5c0-8f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.586 187132 DEBUG os_vif [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:6b:89:3c,bridge_name='br-int',has_traffic_filtering=True,id=69f4d5c0-8f90-4321-8f66-92eb4d8d49b3,network=Network(d1d0291b-2b4e-477b-a989-16bcd5f034d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f4d5c0-8f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.587 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.588 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap69f4d5c0-8f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.589 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.591 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.593 187132 INFO os_vif [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:6b:89:3c,bridge_name='br-int',has_traffic_filtering=True,id=69f4d5c0-8f90-4321-8f66-92eb4d8d49b3,network=Network(d1d0291b-2b4e-477b-a989-16bcd5f034d4),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap69f4d5c0-8f')#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.594 187132 INFO nova.virt.libvirt.driver [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Deleting instance files /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773_del#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.594 187132 INFO nova.virt.libvirt.driver [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Deletion of /var/lib/nova/instances/b7fcd131-1c40-4ddc-9d8a-6a9b503cb773_del complete#033[00m
Dec 11 01:13:37 np0005554845 podman[220671]: 2025-12-11 06:13:37.595611702 +0000 UTC m=+0.044943796 container remove 8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.600 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[82cafe06-c2fb-48c4-a28b-232d77321b7b]: (4, ('Thu Dec 11 06:13:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b (8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973)\n8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973\nThu Dec 11 06:13:37 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b (8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973)\n8c5ca61401e1f654d51895564c9fb2cdd703a06a66573134b4f60667e501e973\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.601 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1d085242-621a-4816-9863-7cc3a7ad23e0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.602 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63692175-a0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:37 np0005554845 kernel: tap63692175-a0: left promiscuous mode
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.604 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.615 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.620 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[11989f29-16e9-4d19-bb13-a4ddfb5e7355]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.634 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1f001712-2334-4e2c-b8d4-78846545a7b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.636 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3715f10e-f79b-4d10-b904-8720b0580cfb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.650 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5b3ff8c2-2325-40b2-a1d7-20eeecc3ecb3]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386736, 'reachable_time': 28376, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220686, 'error': None, 'target': 'ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 systemd[1]: run-netns-ovnmeta\x2d63692175\x2da3b4\x2d4228\x2d86f4\x2d602a703ce14b.mount: Deactivated successfully.
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.653 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-63692175-a3b4-4228-86f4-602a703ce14b deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.655 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[536c47c9-a0d5-463c-a59c-2aa141505c2e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.656 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 in datapath d1d0291b-2b4e-477b-a989-16bcd5f034d4 unbound from our chassis#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.657 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d1d0291b-2b4e-477b-a989-16bcd5f034d4, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.658 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[12eab23e-aa8d-47c0-aff3-8a034568928f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.658 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4 namespace which is not needed anymore#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.659 187132 INFO nova.compute.manager [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Took 0.40 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.660 187132 DEBUG oslo.service.loopingcall [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.660 187132 DEBUG nova.compute.manager [-] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.660 187132 DEBUG nova.network.neutron [-] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:13:37 np0005554845 neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4[220386]: [NOTICE]   (220390) : haproxy version is 2.8.14-c23fe91
Dec 11 01:13:37 np0005554845 neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4[220386]: [NOTICE]   (220390) : path to executable is /usr/sbin/haproxy
Dec 11 01:13:37 np0005554845 neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4[220386]: [WARNING]  (220390) : Exiting Master process...
Dec 11 01:13:37 np0005554845 neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4[220386]: [WARNING]  (220390) : Exiting Master process...
Dec 11 01:13:37 np0005554845 neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4[220386]: [ALERT]    (220390) : Current worker (220392) exited with code 143 (Terminated)
Dec 11 01:13:37 np0005554845 neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4[220386]: [WARNING]  (220390) : All workers exited. Exiting... (0)
Dec 11 01:13:37 np0005554845 systemd[1]: libpod-d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5.scope: Deactivated successfully.
Dec 11 01:13:37 np0005554845 podman[220704]: 2025-12-11 06:13:37.823858089 +0000 UTC m=+0.052151846 container died d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 11 01:13:37 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5-userdata-shm.mount: Deactivated successfully.
Dec 11 01:13:37 np0005554845 systemd[1]: var-lib-containers-storage-overlay-f31058af8145d8311069195bf7e8b159f46b56fd22121985e47b921ad22fb479-merged.mount: Deactivated successfully.
Dec 11 01:13:37 np0005554845 podman[220704]: 2025-12-11 06:13:37.861823563 +0000 UTC m=+0.090117280 container cleanup d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 01:13:37 np0005554845 systemd[1]: libpod-conmon-d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5.scope: Deactivated successfully.
Dec 11 01:13:37 np0005554845 podman[220735]: 2025-12-11 06:13:37.921571886 +0000 UTC m=+0.042215533 container remove d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.schema-version=1.0)
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.926 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d8eede20-0028-42bf-ac51-985cf8b55530]: (4, ('Thu Dec 11 06:13:37 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4 (d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5)\nd7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5\nThu Dec 11 06:13:37 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4 (d7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5)\nd7590df852db34158d045430fa7dafa565e3c24d5f6990a60e32fa3bb4ac42c5\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.928 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5d731c7c-81c9-491f-b451-dc31b40e6874]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.929 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd1d0291b-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.973 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 kernel: tapd1d0291b-20: left promiscuous mode
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.977 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:37.982 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[87257b24-56a8-406d-a05f-3d9d60984859]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:37 np0005554845 nova_compute[187128]: 2025-12-11 06:13:37.991 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:38.004 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[e1e03fd0-4388-473b-8874-558deb363ae7]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:38.006 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[83c36892-0ae9-46a8-88ae-8014c4157e7f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:38.025 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[bb403715-30dc-4181-9528-ac0ea9fc861d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 386827, 'reachable_time': 33517, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 220750, 'error': None, 'target': 'ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:38.027 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d1d0291b-2b4e-477b-a989-16bcd5f034d4 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:13:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:38.027 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[3bb91407-cd5c-4cb5-b498-2458346f94cc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:13:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:38.028 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:13:38 np0005554845 systemd[1]: run-netns-ovnmeta\x2dd1d0291b\x2d2b4e\x2d477b\x2da989\x2d16bcd5f034d4.mount: Deactivated successfully.
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.035 187132 DEBUG nova.network.neutron [req-c4ba0aa1-a3b0-4951-965a-ed4d14bdd38d req-b69a7a32-7dbc-4944-9197-d48e700ee76f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Updated VIF entry in instance network info cache for port 4afc532b-f213-41cf-9252-65894783ee04. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.036 187132 DEBUG nova.network.neutron [req-c4ba0aa1-a3b0-4951-965a-ed4d14bdd38d req-b69a7a32-7dbc-4944-9197-d48e700ee76f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Updating instance_info_cache with network_info: [{"id": "4afc532b-f213-41cf-9252-65894783ee04", "address": "fa:16:3e:af:0b:4a", "network": {"id": "63692175-a3b4-4228-86f4-602a703ce14b", "bridge": "br-int", "label": "tempest-network-smoke--1079525654", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4afc532b-f2", "ovs_interfaceid": "4afc532b-f213-41cf-9252-65894783ee04", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "address": "fa:16:3e:6b:89:3c", "network": {"id": "d1d0291b-2b4e-477b-a989-16bcd5f034d4", "bridge": "br-int", "label": "tempest-network-smoke--1099169692", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8:0:1::/64", "dns": [], 
"gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe6b:893c", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap69f4d5c0-8f", "ovs_interfaceid": "69f4d5c0-8f90-4321-8f66-92eb4d8d49b3", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.064 187132 DEBUG oslo_concurrency.lockutils [req-c4ba0aa1-a3b0-4951-965a-ed4d14bdd38d req-b69a7a32-7dbc-4944-9197-d48e700ee76f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.268 187132 DEBUG nova.compute.manager [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-vif-unplugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.268 187132 DEBUG oslo_concurrency.lockutils [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.269 187132 DEBUG oslo_concurrency.lockutils [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.269 187132 DEBUG oslo_concurrency.lockutils [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.269 187132 DEBUG nova.compute.manager [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] No waiting events found dispatching network-vif-unplugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.270 187132 DEBUG nova.compute.manager [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-vif-unplugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.270 187132 DEBUG nova.compute.manager [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-vif-plugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.271 187132 DEBUG oslo_concurrency.lockutils [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.271 187132 DEBUG oslo_concurrency.lockutils [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.271 187132 DEBUG oslo_concurrency.lockutils [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.272 187132 DEBUG nova.compute.manager [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] No waiting events found dispatching network-vif-plugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.272 187132 WARNING nova.compute.manager [req-788a99fe-983a-4f67-bcf2-5adeb919b15f req-b3e8d721-e9af-4093-b298-c6bab1ee0e10 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received unexpected event network-vif-plugged-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 for instance with vm_state active and task_state deleting.#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.565 187132 DEBUG nova.network.neutron [-] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.583 187132 INFO nova.compute.manager [-] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Took 1.92 seconds to deallocate network for instance.#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.625 187132 DEBUG oslo_concurrency.lockutils [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.626 187132 DEBUG oslo_concurrency.lockutils [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.678 187132 DEBUG nova.compute.provider_tree [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.693 187132 DEBUG nova.scheduler.client.report [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.716 187132 DEBUG oslo_concurrency.lockutils [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.090s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.743 187132 INFO nova.scheduler.client.report [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Deleted allocations for instance b7fcd131-1c40-4ddc-9d8a-6a9b503cb773#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.747 187132 DEBUG nova.compute.manager [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-vif-unplugged-4afc532b-f213-41cf-9252-65894783ee04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.747 187132 DEBUG oslo_concurrency.lockutils [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.747 187132 DEBUG oslo_concurrency.lockutils [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.748 187132 DEBUG oslo_concurrency.lockutils [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.748 187132 DEBUG nova.compute.manager [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] No waiting events found dispatching network-vif-unplugged-4afc532b-f213-41cf-9252-65894783ee04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.748 187132 WARNING nova.compute.manager [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received unexpected event network-vif-unplugged-4afc532b-f213-41cf-9252-65894783ee04 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.748 187132 DEBUG nova.compute.manager [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-vif-plugged-4afc532b-f213-41cf-9252-65894783ee04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.748 187132 DEBUG oslo_concurrency.lockutils [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.749 187132 DEBUG oslo_concurrency.lockutils [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.749 187132 DEBUG oslo_concurrency.lockutils [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.749 187132 DEBUG nova.compute.manager [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] No waiting events found dispatching network-vif-plugged-4afc532b-f213-41cf-9252-65894783ee04 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.749 187132 WARNING nova.compute.manager [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received unexpected event network-vif-plugged-4afc532b-f213-41cf-9252-65894783ee04 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.749 187132 DEBUG nova.compute.manager [req-334c2a54-7494-47bf-8387-be610bebd322 req-98f9d520-02cf-47dd-bac2-5f5e86ebb8df eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-vif-deleted-4afc532b-f213-41cf-9252-65894783ee04 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:39 np0005554845 nova_compute[187128]: 2025-12-11 06:13:39.805 187132 DEBUG oslo_concurrency.lockutils [None req-dc2e4df6-d99e-4488-89b7-64547e9f2356 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "b7fcd131-1c40-4ddc-9d8a-6a9b503cb773" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.545s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:13:40 np0005554845 nova_compute[187128]: 2025-12-11 06:13:40.509 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:40 np0005554845 nova_compute[187128]: 2025-12-11 06:13:40.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:13:41 np0005554845 nova_compute[187128]: 2025-12-11 06:13:41.845 187132 DEBUG nova.compute.manager [req-8326e932-6375-42db-bb85-4f6b2698e8ce req-77863f33-f036-4275-a6e9-5d623e9111e0 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Received event network-vif-deleted-69f4d5c0-8f90-4321-8f66-92eb4d8d49b3 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:13:42 np0005554845 nova_compute[187128]: 2025-12-11 06:13:42.590 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:45 np0005554845 nova_compute[187128]: 2025-12-11 06:13:45.510 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:13:46.031 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '12'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:13:47 np0005554845 nova_compute[187128]: 2025-12-11 06:13:47.594 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:48 np0005554845 podman[220751]: 2025-12-11 06:13:48.12835 +0000 UTC m=+0.056593702 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:13:50 np0005554845 nova_compute[187128]: 2025-12-11 06:13:50.511 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:52 np0005554845 nova_compute[187128]: 2025-12-11 06:13:52.526 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433617.5250897, b7fcd131-1c40-4ddc-9d8a-6a9b503cb773 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:13:52 np0005554845 nova_compute[187128]: 2025-12-11 06:13:52.527 187132 INFO nova.compute.manager [-] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:13:52 np0005554845 nova_compute[187128]: 2025-12-11 06:13:52.545 187132 DEBUG nova.compute.manager [None req-3e7d94be-853c-4288-af12-a9b0c42d730a - - - - - -] [instance: b7fcd131-1c40-4ddc-9d8a-6a9b503cb773] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:13:52 np0005554845 nova_compute[187128]: 2025-12-11 06:13:52.597 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:53 np0005554845 nova_compute[187128]: 2025-12-11 06:13:53.760 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:53 np0005554845 nova_compute[187128]: 2025-12-11 06:13:53.937 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:54 np0005554845 podman[220776]: 2025-12-11 06:13:54.115030135 +0000 UTC m=+0.050747794 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=edpm, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute)
Dec 11 01:13:55 np0005554845 nova_compute[187128]: 2025-12-11 06:13:55.513 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:57 np0005554845 podman[220796]: 2025-12-11 06:13:57.156482798 +0000 UTC m=+0.079587740 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:13:57 np0005554845 podman[220797]: 2025-12-11 06:13:57.187919815 +0000 UTC m=+0.114694896 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3)
Dec 11 01:13:57 np0005554845 nova_compute[187128]: 2025-12-11 06:13:57.599 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:13:59 np0005554845 podman[220840]: 2025-12-11 06:13:59.184141808 +0000 UTC m=+0.107952703 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, container_name=multipathd)
Dec 11 01:14:00 np0005554845 nova_compute[187128]: 2025-12-11 06:14:00.514 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:02 np0005554845 nova_compute[187128]: 2025-12-11 06:14:02.660 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:03 np0005554845 podman[220861]: 2025-12-11 06:14:03.12763229 +0000 UTC m=+0.056567242 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:14:03 np0005554845 podman[220862]: 2025-12-11 06:14:03.13719267 +0000 UTC m=+0.061126627 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.component=ubi9-minimal-container, vcs-type=git, container_name=openstack_network_exporter, distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., io.buildah.version=1.33.7, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., io.openshift.expose-services=, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']})
Dec 11 01:14:05 np0005554845 nova_compute[187128]: 2025-12-11 06:14:05.516 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:07 np0005554845 nova_compute[187128]: 2025-12-11 06:14:07.663 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:10 np0005554845 nova_compute[187128]: 2025-12-11 06:14:10.518 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:12 np0005554845 nova_compute[187128]: 2025-12-11 06:14:12.665 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:15 np0005554845 nova_compute[187128]: 2025-12-11 06:14:15.522 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:17 np0005554845 nova_compute[187128]: 2025-12-11 06:14:17.668 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:19 np0005554845 podman[220907]: 2025-12-11 06:14:19.141296249 +0000 UTC m=+0.076700651 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:14:20 np0005554845 nova_compute[187128]: 2025-12-11 06:14:20.524 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:22 np0005554845 nova_compute[187128]: 2025-12-11 06:14:22.671 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:25 np0005554845 podman[220932]: 2025-12-11 06:14:25.124606642 +0000 UTC m=+0.059539363 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 11 01:14:25 np0005554845 nova_compute[187128]: 2025-12-11 06:14:25.525 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:26.226 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:14:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:26.227 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:14:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:26.227 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:14:27 np0005554845 nova_compute[187128]: 2025-12-11 06:14:27.673 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:28 np0005554845 podman[220954]: 2025-12-11 06:14:28.127749202 +0000 UTC m=+0.058406562 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 11 01:14:28 np0005554845 podman[220955]: 2025-12-11 06:14:28.14565014 +0000 UTC m=+0.075610671 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:14:28 np0005554845 nova_compute[187128]: 2025-12-11 06:14:28.707 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:14:28 np0005554845 nova_compute[187128]: 2025-12-11 06:14:28.739 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:14:28 np0005554845 nova_compute[187128]: 2025-12-11 06:14:28.739 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:14:28 np0005554845 nova_compute[187128]: 2025-12-11 06:14:28.740 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:14:28 np0005554845 nova_compute[187128]: 2025-12-11 06:14:28.740 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:14:28 np0005554845 nova_compute[187128]: 2025-12-11 06:14:28.922 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:14:28 np0005554845 nova_compute[187128]: 2025-12-11 06:14:28.923 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5724MB free_disk=73.29245376586914GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:14:28 np0005554845 nova_compute[187128]: 2025-12-11 06:14:28.923 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:14:28 np0005554845 nova_compute[187128]: 2025-12-11 06:14:28.924 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:14:29 np0005554845 nova_compute[187128]: 2025-12-11 06:14:29.031 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:14:29 np0005554845 nova_compute[187128]: 2025-12-11 06:14:29.032 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:14:29 np0005554845 nova_compute[187128]: 2025-12-11 06:14:29.054 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:14:29 np0005554845 nova_compute[187128]: 2025-12-11 06:14:29.070 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:14:29 np0005554845 nova_compute[187128]: 2025-12-11 06:14:29.105 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:14:29 np0005554845 nova_compute[187128]: 2025-12-11 06:14:29.105 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.182s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:14:29 np0005554845 nova_compute[187128]: 2025-12-11 06:14:29.888 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:14:29 np0005554845 nova_compute[187128]: 2025-12-11 06:14:29.889 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:14:29 np0005554845 nova_compute[187128]: 2025-12-11 06:14:29.930 187132 DEBUG nova.compute.manager [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:14:30 np0005554845 nova_compute[187128]: 2025-12-11 06:14:30.054 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:14:30 np0005554845 nova_compute[187128]: 2025-12-11 06:14:30.055 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:14:30 np0005554845 nova_compute[187128]: 2025-12-11 06:14:30.065 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:14:30 np0005554845 nova_compute[187128]: 2025-12-11 06:14:30.065 187132 INFO nova.compute.claims [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:14:30 np0005554845 nova_compute[187128]: 2025-12-11 06:14:30.090 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:14:30 np0005554845 nova_compute[187128]: 2025-12-11 06:14:30.090 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:14:30 np0005554845 nova_compute[187128]: 2025-12-11 06:14:30.090 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:14:30 np0005554845 podman[221000]: 2025-12-11 06:14:30.141435601 +0000 UTC m=+0.070486122 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 01:14:30 np0005554845 nova_compute[187128]: 2025-12-11 06:14:30.527 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:31 np0005554845 nova_compute[187128]: 2025-12-11 06:14:31.065 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:14:31 np0005554845 nova_compute[187128]: 2025-12-11 06:14:31.065 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:14:31 np0005554845 nova_compute[187128]: 2025-12-11 06:14:31.066 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:14:31 np0005554845 nova_compute[187128]: 2025-12-11 06:14:31.457 187132 DEBUG nova.compute.provider_tree [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:14:31 np0005554845 nova_compute[187128]: 2025-12-11 06:14:31.662 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:14:31 np0005554845 nova_compute[187128]: 2025-12-11 06:14:31.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:14:31 np0005554845 nova_compute[187128]: 2025-12-11 06:14:31.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:14:31 np0005554845 nova_compute[187128]: 2025-12-11 06:14:31.750 187132 DEBUG nova.scheduler.client.report [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.542 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 2.486s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.543 187132 DEBUG nova.compute.manager [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.605 187132 DEBUG nova.compute.manager [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.606 187132 DEBUG nova.network.neutron [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.632 187132 INFO nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.654 187132 DEBUG nova.compute.manager [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.675 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.786 187132 DEBUG nova.compute.manager [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.789 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.789 187132 INFO nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Creating image(s)#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.790 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "/var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.791 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "/var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.792 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "/var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.816 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.874 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.876 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.877 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.900 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.959 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:14:32 np0005554845 nova_compute[187128]: 2025-12-11 06:14:32.960 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.003 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk 1073741824" returned: 0 in 0.042s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.004 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.005 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.080 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.081 187132 DEBUG nova.virt.disk.api [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Checking if we can resize image /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.082 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.144 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.145 187132 DEBUG nova.virt.disk.api [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Cannot resize image /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.146 187132 DEBUG nova.objects.instance [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lazy-loading 'migration_context' on Instance uuid a64d006a-fa23-4538-a7c4-57160050b331 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.158 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.159 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Ensure instance console log exists: /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.159 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.160 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.160 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.322 187132 DEBUG nova.policy [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '3b482a000b3e4b5c964be05bad2a0418', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': 'fce35ab888e44e46b3108813dcdf4163', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:14:33 np0005554845 nova_compute[187128]: 2025-12-11 06:14:33.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:14:34 np0005554845 podman[221034]: 2025-12-11 06:14:34.116158002 +0000 UTC m=+0.048991856 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:14:34 np0005554845 podman[221035]: 2025-12-11 06:14:34.151225798 +0000 UTC m=+0.082336764 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, io.buildah.version=1.33.7, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, distribution-scope=public, io.openshift.tags=minimal rhel9, release=1755695350, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, io.openshift.expose-services=, vcs-type=git, managed_by=edpm_ansible, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 01:14:34 np0005554845 nova_compute[187128]: 2025-12-11 06:14:34.449 187132 DEBUG nova.network.neutron [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Successfully created port: 39751729-025a-4280-89aa-883712fc8dcb _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:14:35 np0005554845 nova_compute[187128]: 2025-12-11 06:14:35.529 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:35 np0005554845 nova_compute[187128]: 2025-12-11 06:14:35.706 187132 DEBUG nova.network.neutron [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Successfully updated port: 39751729-025a-4280-89aa-883712fc8dcb _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:14:35 np0005554845 nova_compute[187128]: 2025-12-11 06:14:35.739 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:14:35 np0005554845 nova_compute[187128]: 2025-12-11 06:14:35.740 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquired lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:14:35 np0005554845 nova_compute[187128]: 2025-12-11 06:14:35.740 187132 DEBUG nova.network.neutron [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:14:35 np0005554845 nova_compute[187128]: 2025-12-11 06:14:35.824 187132 DEBUG nova.compute.manager [req-0adf9ac6-1261-40b6-a02f-3f833abae1d6 req-ca7d33c7-9752-4a6f-8a25-e7f28f63d88e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-changed-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:14:35 np0005554845 nova_compute[187128]: 2025-12-11 06:14:35.825 187132 DEBUG nova.compute.manager [req-0adf9ac6-1261-40b6-a02f-3f833abae1d6 req-ca7d33c7-9752-4a6f-8a25-e7f28f63d88e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Refreshing instance network info cache due to event network-changed-39751729-025a-4280-89aa-883712fc8dcb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:14:35 np0005554845 nova_compute[187128]: 2025-12-11 06:14:35.826 187132 DEBUG oslo_concurrency.lockutils [req-0adf9ac6-1261-40b6-a02f-3f833abae1d6 req-ca7d33c7-9752-4a6f-8a25-e7f28f63d88e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:14:36 np0005554845 nova_compute[187128]: 2025-12-11 06:14:36.577 187132 DEBUG nova.network.neutron [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:14:36 np0005554845 nova_compute[187128]: 2025-12-11 06:14:36.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:14:37 np0005554845 nova_compute[187128]: 2025-12-11 06:14:37.677 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:37 np0005554845 nova_compute[187128]: 2025-12-11 06:14:37.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:14:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:37.784 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=13, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=12) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:14:37 np0005554845 nova_compute[187128]: 2025-12-11 06:14:37.784 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:37 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:37.786 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.262 187132 DEBUG nova.network.neutron [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updating instance_info_cache with network_info: [{"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.314 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Releasing lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.315 187132 DEBUG nova.compute.manager [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Instance network_info: |[{"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.315 187132 DEBUG oslo_concurrency.lockutils [req-0adf9ac6-1261-40b6-a02f-3f833abae1d6 req-ca7d33c7-9752-4a6f-8a25-e7f28f63d88e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.315 187132 DEBUG nova.network.neutron [req-0adf9ac6-1261-40b6-a02f-3f833abae1d6 req-ca7d33c7-9752-4a6f-8a25-e7f28f63d88e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Refreshing network info cache for port 39751729-025a-4280-89aa-883712fc8dcb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.319 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Start _get_guest_xml network_info=[{"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.323 187132 WARNING nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.328 187132 DEBUG nova.virt.libvirt.host [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.328 187132 DEBUG nova.virt.libvirt.host [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.333 187132 DEBUG nova.virt.libvirt.host [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.334 187132 DEBUG nova.virt.libvirt.host [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.335 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.335 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.335 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.336 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.336 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.336 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.336 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.336 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.337 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.337 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.337 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.337 187132 DEBUG nova.virt.hardware [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.341 187132 DEBUG nova.virt.libvirt.vif [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:14:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1010928387',display_name='tempest-TestNetworkBasicOps-server-1010928387',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1010928387',id=33,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChn3ME9G0ksjL7sDCwE+QfyqqlTHK8p/F3BNnS1ZWdXbZo0Zm3wICJKwIiepJAgzTU7UPI647ehkYD0brI4z155ZR9zh3lTJ1lrapS7o+flyaWl8nGk6DMyBb5QomTfgw==',key_name='tempest-TestNetworkBasicOps-653853826',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fce35ab888e44e46b3108813dcdf4163',ramdisk_id='',reservation_id='r-65glwa09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1486719489',owner_user_name='tempest-TestNetworkBasicOps-1486719489-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:14:32Z,user_data=None,user_id='3b482a000b3e4b5c964be05bad2a0418',uuid=a64d006a-fa23-4538-a7c4-57160050b331,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.341 187132 DEBUG nova.network.os_vif_util [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converting VIF {"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.342 187132 DEBUG nova.network.os_vif_util [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:76:eb,bridge_name='br-int',has_traffic_filtering=True,id=39751729-025a-4280-89aa-883712fc8dcb,network=Network(15a011b8-4d7f-4851-9aed-d01bb5a29d21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39751729-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.342 187132 DEBUG nova.objects.instance [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lazy-loading 'pci_devices' on Instance uuid a64d006a-fa23-4538-a7c4-57160050b331 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.358 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  <uuid>a64d006a-fa23-4538-a7c4-57160050b331</uuid>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  <name>instance-00000021</name>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestNetworkBasicOps-server-1010928387</nova:name>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:14:38</nova:creationTime>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:        <nova:user uuid="3b482a000b3e4b5c964be05bad2a0418">tempest-TestNetworkBasicOps-1486719489-project-member</nova:user>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:        <nova:project uuid="fce35ab888e44e46b3108813dcdf4163">tempest-TestNetworkBasicOps-1486719489</nova:project>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:        <nova:port uuid="39751729-025a-4280-89aa-883712fc8dcb">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.11" ipVersion="4"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <entry name="serial">a64d006a-fa23-4538-a7c4-57160050b331</entry>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <entry name="uuid">a64d006a-fa23-4538-a7c4-57160050b331</entry>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk.config"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:34:76:eb"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <target dev="tap39751729-02"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/console.log" append="off"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:14:38 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:14:38 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:14:38 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:14:38 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.359 187132 DEBUG nova.compute.manager [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Preparing to wait for external event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.359 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.360 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.360 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.360 187132 DEBUG nova.virt.libvirt.vif [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:14:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1010928387',display_name='tempest-TestNetworkBasicOps-server-1010928387',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1010928387',id=33,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChn3ME9G0ksjL7sDCwE+QfyqqlTHK8p/F3BNnS1ZWdXbZo0Zm3wICJKwIiepJAgzTU7UPI647ehkYD0brI4z155ZR9zh3lTJ1lrapS7o+flyaWl8nGk6DMyBb5QomTfgw==',key_name='tempest-TestNetworkBasicOps-653853826',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='fce35ab888e44e46b3108813dcdf4163',ramdisk_id='',reservation_id='r-65glwa09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkBasicOps-1486719489',owner_user_name='tempest-TestNetworkBasicOps-1486719489-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:14:32Z,user_data=None,user_id='3b482a000b3e4b5c964be05bad2a0418',uuid=a64d006a-fa23-4538-a7c4-57160050b331,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.361 187132 DEBUG nova.network.os_vif_util [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converting VIF {"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.361 187132 DEBUG nova.network.os_vif_util [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:34:76:eb,bridge_name='br-int',has_traffic_filtering=True,id=39751729-025a-4280-89aa-883712fc8dcb,network=Network(15a011b8-4d7f-4851-9aed-d01bb5a29d21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39751729-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.362 187132 DEBUG os_vif [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:76:eb,bridge_name='br-int',has_traffic_filtering=True,id=39751729-025a-4280-89aa-883712fc8dcb,network=Network(15a011b8-4d7f-4851-9aed-d01bb5a29d21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39751729-02') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.362 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.363 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.363 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.365 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.365 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap39751729-02, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.366 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap39751729-02, col_values=(('external_ids', {'iface-id': '39751729-025a-4280-89aa-883712fc8dcb', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:34:76:eb', 'vm-uuid': 'a64d006a-fa23-4538-a7c4-57160050b331'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.367 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:38 np0005554845 NetworkManager[55529]: <info>  [1765433678.3681] manager: (tap39751729-02): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/111)
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.369 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.375 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.375 187132 INFO os_vif [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:34:76:eb,bridge_name='br-int',has_traffic_filtering=True,id=39751729-025a-4280-89aa-883712fc8dcb,network=Network(15a011b8-4d7f-4851-9aed-d01bb5a29d21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39751729-02')#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.524 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.524 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.525 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] No VIF found with MAC fa:16:3e:34:76:eb, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:14:38 np0005554845 nova_compute[187128]: 2025-12-11 06:14:38.525 187132 INFO nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Using config drive#033[00m
Dec 11 01:14:39 np0005554845 nova_compute[187128]: 2025-12-11 06:14:39.380 187132 INFO nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Creating config drive at /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk.config#033[00m
Dec 11 01:14:39 np0005554845 nova_compute[187128]: 2025-12-11 06:14:39.389 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmnk92h2i execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:14:39 np0005554845 nova_compute[187128]: 2025-12-11 06:14:39.516 187132 DEBUG oslo_concurrency.processutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpmnk92h2i" returned: 0 in 0.127s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:14:39 np0005554845 kernel: tap39751729-02: entered promiscuous mode
Dec 11 01:14:39 np0005554845 ovn_controller[95428]: 2025-12-11T06:14:39Z|00220|binding|INFO|Claiming lport 39751729-025a-4280-89aa-883712fc8dcb for this chassis.
Dec 11 01:14:39 np0005554845 NetworkManager[55529]: <info>  [1765433679.5905] manager: (tap39751729-02): new Tun device (/org/freedesktop/NetworkManager/Devices/112)
Dec 11 01:14:39 np0005554845 ovn_controller[95428]: 2025-12-11T06:14:39Z|00221|binding|INFO|39751729-025a-4280-89aa-883712fc8dcb: Claiming fa:16:3e:34:76:eb 10.100.0.11
Dec 11 01:14:39 np0005554845 nova_compute[187128]: 2025-12-11 06:14:39.589 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:39 np0005554845 nova_compute[187128]: 2025-12-11 06:14:39.595 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.618 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:76:eb 10.100.0.11'], port_security=['fa:16:3e:34:76:eb 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a64d006a-fa23-4538-a7c4-57160050b331', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-15a011b8-4d7f-4851-9aed-d01bb5a29d21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fce35ab888e44e46b3108813dcdf4163', 'neutron:revision_number': '2', 'neutron:security_group_ids': '0b532bd0-5331-4d54-b5de-ee38552da3b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b2b0d82-7dfc-4180-a13c-cadf8e0c53cc, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=39751729-025a-4280-89aa-883712fc8dcb) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.619 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 39751729-025a-4280-89aa-883712fc8dcb in datapath 15a011b8-4d7f-4851-9aed-d01bb5a29d21 bound to our chassis#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.621 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 15a011b8-4d7f-4851-9aed-d01bb5a29d21#033[00m
Dec 11 01:14:39 np0005554845 systemd-udevd[221095]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.635 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[efe1a211-a8e4-4df7-b0b6-23404fdc84bd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.636 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap15a011b8-41 in ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.638 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap15a011b8-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.638 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4dc8ff57-f43e-4b70-a1e8-4d9e04c3b791]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 NetworkManager[55529]: <info>  [1765433679.6400] device (tap39751729-02): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:14:39 np0005554845 NetworkManager[55529]: <info>  [1765433679.6420] device (tap39751729-02): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.642 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[39a5aeea-7578-4cd7-aecf-5345c0c49366]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.656 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[93c9a011-22be-4edc-a57c-eed774c17164]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 systemd-machined[153381]: New machine qemu-16-instance-00000021.
Dec 11 01:14:39 np0005554845 nova_compute[187128]: 2025-12-11 06:14:39.670 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:39 np0005554845 ovn_controller[95428]: 2025-12-11T06:14:39Z|00222|binding|INFO|Setting lport 39751729-025a-4280-89aa-883712fc8dcb ovn-installed in OVS
Dec 11 01:14:39 np0005554845 ovn_controller[95428]: 2025-12-11T06:14:39Z|00223|binding|INFO|Setting lport 39751729-025a-4280-89aa-883712fc8dcb up in Southbound
Dec 11 01:14:39 np0005554845 nova_compute[187128]: 2025-12-11 06:14:39.678 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.680 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[310594e3-77fc-4e4d-91af-fa0864265325]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 systemd[1]: Started Virtual Machine qemu-16-instance-00000021.
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.710 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[257a920d-ad79-44eb-8ff1-3fc30b1672b6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 NetworkManager[55529]: <info>  [1765433679.7162] manager: (tap15a011b8-40): new Veth device (/org/freedesktop/NetworkManager/Devices/113)
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.716 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[aebaf3a2-0ba2-41ad-8bc6-8646f41cdd84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 systemd-udevd[221100]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.744 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[1eda178a-0751-43e4-8aa3-92b8a5a4b269]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.746 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[f31b977f-06d2-4de4-ae5f-63e784239072]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 NetworkManager[55529]: <info>  [1765433679.7684] device (tap15a011b8-40): carrier: link connected
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.772 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[bba6f9da-9b3f-4f29-bacd-61eee47edf4f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.789 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1b6418e5-5420-4cf2-b822-6f7477e36d06]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap15a011b8-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:71:e3:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395441, 'reachable_time': 40482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221131, 'error': None, 'target': 'ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.804 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[89a4962c-9f2d-406a-86d3-1fd4986412f8]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe71:e30e'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 395441, 'tstamp': 395441}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221132, 'error': None, 'target': 'ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.819 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[949988a7-db05-42f5-9969-82fd350c890a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap15a011b8-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:71:e3:0e'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 67], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395441, 'reachable_time': 40482, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221133, 'error': None, 'target': 'ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.854 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8b7724aa-f778-4199-a5ab-d8a92d0a405d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.922 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[337d65c8-49e9-4af6-92f3-beaf602f539a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.924 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15a011b8-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.924 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.925 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap15a011b8-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:14:39 np0005554845 nova_compute[187128]: 2025-12-11 06:14:39.956 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:39 np0005554845 kernel: tap15a011b8-40: entered promiscuous mode
Dec 11 01:14:39 np0005554845 NetworkManager[55529]: <info>  [1765433679.9679] manager: (tap15a011b8-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/114)
Dec 11 01:14:39 np0005554845 nova_compute[187128]: 2025-12-11 06:14:39.967 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.968 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap15a011b8-40, col_values=(('external_ids', {'iface-id': 'ffe8727c-945e-47a3-8743-f5ce3d704cdf'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:14:39 np0005554845 nova_compute[187128]: 2025-12-11 06:14:39.970 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:39 np0005554845 ovn_controller[95428]: 2025-12-11T06:14:39Z|00224|binding|INFO|Releasing lport ffe8727c-945e-47a3-8743-f5ce3d704cdf from this chassis (sb_readonly=0)
Dec 11 01:14:39 np0005554845 nova_compute[187128]: 2025-12-11 06:14:39.995 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.996 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/15a011b8-4d7f-4851-9aed-d01bb5a29d21.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/15a011b8-4d7f-4851-9aed-d01bb5a29d21.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.997 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ae7cfff7-fd69-41b1-985b-963d171fb0f9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:39.998 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-15a011b8-4d7f-4851-9aed-d01bb5a29d21
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/15a011b8-4d7f-4851-9aed-d01bb5a29d21.pid.haproxy
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:14:39 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 15a011b8-4d7f-4851-9aed-d01bb5a29d21
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:14:40 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:40.000 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21', 'env', 'PROCESS_TAG=haproxy-15a011b8-4d7f-4851-9aed-d01bb5a29d21', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/15a011b8-4d7f-4851-9aed-d01bb5a29d21.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.169 187132 DEBUG nova.network.neutron [req-0adf9ac6-1261-40b6-a02f-3f833abae1d6 req-ca7d33c7-9752-4a6f-8a25-e7f28f63d88e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updated VIF entry in instance network info cache for port 39751729-025a-4280-89aa-883712fc8dcb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.170 187132 DEBUG nova.network.neutron [req-0adf9ac6-1261-40b6-a02f-3f833abae1d6 req-ca7d33c7-9752-4a6f-8a25-e7f28f63d88e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updating instance_info_cache with network_info: [{"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.190 187132 DEBUG oslo_concurrency.lockutils [req-0adf9ac6-1261-40b6-a02f-3f833abae1d6 req-ca7d33c7-9752-4a6f-8a25-e7f28f63d88e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.270 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433680.2693028, a64d006a-fa23-4538-a7c4-57160050b331 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.271 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: a64d006a-fa23-4538-a7c4-57160050b331] VM Started (Lifecycle Event)#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.290 187132 DEBUG nova.compute.manager [req-3967d5b2-7d08-43bc-b496-be91cf534c8e req-c234d94b-2552-4912-a656-5739141be057 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.291 187132 DEBUG oslo_concurrency.lockutils [req-3967d5b2-7d08-43bc-b496-be91cf534c8e req-c234d94b-2552-4912-a656-5739141be057 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.291 187132 DEBUG oslo_concurrency.lockutils [req-3967d5b2-7d08-43bc-b496-be91cf534c8e req-c234d94b-2552-4912-a656-5739141be057 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.291 187132 DEBUG oslo_concurrency.lockutils [req-3967d5b2-7d08-43bc-b496-be91cf534c8e req-c234d94b-2552-4912-a656-5739141be057 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.291 187132 DEBUG nova.compute.manager [req-3967d5b2-7d08-43bc-b496-be91cf534c8e req-c234d94b-2552-4912-a656-5739141be057 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Processing event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.292 187132 DEBUG nova.compute.manager [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.300 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.301 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.305 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.306 187132 INFO nova.virt.libvirt.driver [-] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Instance spawned successfully.#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.306 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.326 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: a64d006a-fa23-4538-a7c4-57160050b331] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.327 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433680.2701955, a64d006a-fa23-4538-a7c4-57160050b331 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.327 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: a64d006a-fa23-4538-a7c4-57160050b331] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.335 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.336 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.336 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.337 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.337 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.338 187132 DEBUG nova.virt.libvirt.driver [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.347 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.351 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433680.2951317, a64d006a-fa23-4538-a7c4-57160050b331 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.351 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: a64d006a-fa23-4538-a7c4-57160050b331] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.369 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.372 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.396 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: a64d006a-fa23-4538-a7c4-57160050b331] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:14:40 np0005554845 podman[221171]: 2025-12-11 06:14:40.398959266 +0000 UTC m=+0.053710115 container create aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.405 187132 INFO nova.compute.manager [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Took 7.62 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.406 187132 DEBUG nova.compute.manager [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:14:40 np0005554845 systemd[1]: Started libpod-conmon-aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239.scope.
Dec 11 01:14:40 np0005554845 podman[221171]: 2025-12-11 06:14:40.370973424 +0000 UTC m=+0.025724253 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:14:40 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.473 187132 INFO nova.compute.manager [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Took 10.46 seconds to build instance.#033[00m
Dec 11 01:14:40 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3fe29df74c260cb7a96bdcca09e43414acce3e8197c9c1ed47bb459cf86e5d8a/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:14:40 np0005554845 podman[221171]: 2025-12-11 06:14:40.488574148 +0000 UTC m=+0.143324977 container init aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 11 01:14:40 np0005554845 podman[221171]: 2025-12-11 06:14:40.496153134 +0000 UTC m=+0.150903983 container start aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.497 187132 DEBUG oslo_concurrency.lockutils [None req-59a19610-e2c8-4d13-b2ea-d13c8eb285a3 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 10.607s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:14:40 np0005554845 neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21[221186]: [NOTICE]   (221190) : New worker (221192) forked
Dec 11 01:14:40 np0005554845 neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21[221186]: [NOTICE]   (221190) : Loading success.
Dec 11 01:14:40 np0005554845 nova_compute[187128]: 2025-12-11 06:14:40.531 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:42 np0005554845 nova_compute[187128]: 2025-12-11 06:14:42.396 187132 DEBUG nova.compute.manager [req-1de5f112-b10e-48e8-89a6-2e652ff14581 req-d2e358d4-50ef-4317-b546-0b43663b0d60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:14:42 np0005554845 nova_compute[187128]: 2025-12-11 06:14:42.398 187132 DEBUG oslo_concurrency.lockutils [req-1de5f112-b10e-48e8-89a6-2e652ff14581 req-d2e358d4-50ef-4317-b546-0b43663b0d60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:14:42 np0005554845 nova_compute[187128]: 2025-12-11 06:14:42.399 187132 DEBUG oslo_concurrency.lockutils [req-1de5f112-b10e-48e8-89a6-2e652ff14581 req-d2e358d4-50ef-4317-b546-0b43663b0d60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:14:42 np0005554845 nova_compute[187128]: 2025-12-11 06:14:42.400 187132 DEBUG oslo_concurrency.lockutils [req-1de5f112-b10e-48e8-89a6-2e652ff14581 req-d2e358d4-50ef-4317-b546-0b43663b0d60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:14:42 np0005554845 nova_compute[187128]: 2025-12-11 06:14:42.400 187132 DEBUG nova.compute.manager [req-1de5f112-b10e-48e8-89a6-2e652ff14581 req-d2e358d4-50ef-4317-b546-0b43663b0d60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] No waiting events found dispatching network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:14:42 np0005554845 nova_compute[187128]: 2025-12-11 06:14:42.401 187132 WARNING nova.compute.manager [req-1de5f112-b10e-48e8-89a6-2e652ff14581 req-d2e358d4-50ef-4317-b546-0b43663b0d60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received unexpected event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb for instance with vm_state active and task_state None.#033[00m
Dec 11 01:14:43 np0005554845 nova_compute[187128]: 2025-12-11 06:14:43.341 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:43 np0005554845 NetworkManager[55529]: <info>  [1765433683.3425] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/115)
Dec 11 01:14:43 np0005554845 NetworkManager[55529]: <info>  [1765433683.3442] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/116)
Dec 11 01:14:43 np0005554845 nova_compute[187128]: 2025-12-11 06:14:43.367 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:43 np0005554845 nova_compute[187128]: 2025-12-11 06:14:43.411 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:43 np0005554845 ovn_controller[95428]: 2025-12-11T06:14:43Z|00225|binding|INFO|Releasing lport ffe8727c-945e-47a3-8743-f5ce3d704cdf from this chassis (sb_readonly=0)
Dec 11 01:14:43 np0005554845 nova_compute[187128]: 2025-12-11 06:14:43.433 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:14:43.789 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '13'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:14:44 np0005554845 nova_compute[187128]: 2025-12-11 06:14:44.714 187132 DEBUG nova.compute.manager [req-251843c4-2536-4f85-8b66-8f420cf6e9ca req-bf018b2c-d71d-47f6-86d4-24ff75d78d3a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-changed-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:14:44 np0005554845 nova_compute[187128]: 2025-12-11 06:14:44.714 187132 DEBUG nova.compute.manager [req-251843c4-2536-4f85-8b66-8f420cf6e9ca req-bf018b2c-d71d-47f6-86d4-24ff75d78d3a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Refreshing instance network info cache due to event network-changed-39751729-025a-4280-89aa-883712fc8dcb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:14:44 np0005554845 nova_compute[187128]: 2025-12-11 06:14:44.715 187132 DEBUG oslo_concurrency.lockutils [req-251843c4-2536-4f85-8b66-8f420cf6e9ca req-bf018b2c-d71d-47f6-86d4-24ff75d78d3a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:14:44 np0005554845 nova_compute[187128]: 2025-12-11 06:14:44.715 187132 DEBUG oslo_concurrency.lockutils [req-251843c4-2536-4f85-8b66-8f420cf6e9ca req-bf018b2c-d71d-47f6-86d4-24ff75d78d3a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:14:44 np0005554845 nova_compute[187128]: 2025-12-11 06:14:44.716 187132 DEBUG nova.network.neutron [req-251843c4-2536-4f85-8b66-8f420cf6e9ca req-bf018b2c-d71d-47f6-86d4-24ff75d78d3a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Refreshing network info cache for port 39751729-025a-4280-89aa-883712fc8dcb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:14:45 np0005554845 nova_compute[187128]: 2025-12-11 06:14:45.574 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:46 np0005554845 nova_compute[187128]: 2025-12-11 06:14:46.125 187132 DEBUG nova.network.neutron [req-251843c4-2536-4f85-8b66-8f420cf6e9ca req-bf018b2c-d71d-47f6-86d4-24ff75d78d3a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updated VIF entry in instance network info cache for port 39751729-025a-4280-89aa-883712fc8dcb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:14:46 np0005554845 nova_compute[187128]: 2025-12-11 06:14:46.125 187132 DEBUG nova.network.neutron [req-251843c4-2536-4f85-8b66-8f420cf6e9ca req-bf018b2c-d71d-47f6-86d4-24ff75d78d3a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updating instance_info_cache with network_info: [{"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:14:46 np0005554845 nova_compute[187128]: 2025-12-11 06:14:46.146 187132 DEBUG oslo_concurrency.lockutils [req-251843c4-2536-4f85-8b66-8f420cf6e9ca req-bf018b2c-d71d-47f6-86d4-24ff75d78d3a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:14:48 np0005554845 nova_compute[187128]: 2025-12-11 06:14:48.370 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:50 np0005554845 podman[221202]: 2025-12-11 06:14:50.12611303 +0000 UTC m=+0.056183871 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:14:50 np0005554845 nova_compute[187128]: 2025-12-11 06:14:50.576 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:14:52Z|00030|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:34:76:eb 10.100.0.11
Dec 11 01:14:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:14:52Z|00031|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:34:76:eb 10.100.0.11
Dec 11 01:14:53 np0005554845 nova_compute[187128]: 2025-12-11 06:14:53.372 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:55 np0005554845 nova_compute[187128]: 2025-12-11 06:14:55.578 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:56 np0005554845 podman[221241]: 2025-12-11 06:14:56.129004916 +0000 UTC m=+0.063304165 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 01:14:58 np0005554845 nova_compute[187128]: 2025-12-11 06:14:58.376 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:14:59 np0005554845 podman[221263]: 2025-12-11 06:14:59.125133933 +0000 UTC m=+0.059080570 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 11 01:14:59 np0005554845 podman[221264]: 2025-12-11 06:14:59.162139081 +0000 UTC m=+0.093104087 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 11 01:15:00 np0005554845 nova_compute[187128]: 2025-12-11 06:15:00.580 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:01 np0005554845 podman[221306]: 2025-12-11 06:15:01.127254578 +0000 UTC m=+0.066121173 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible)
Dec 11 01:15:03 np0005554845 nova_compute[187128]: 2025-12-11 06:15:03.378 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:05 np0005554845 podman[221328]: 2025-12-11 06:15:05.144320805 +0000 UTC m=+0.065130335 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, maintainer=Red Hat, Inc., vcs-type=git, version=9.6, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., managed_by=edpm_ansible, release=1755695350)
Dec 11 01:15:05 np0005554845 podman[221327]: 2025-12-11 06:15:05.173222923 +0000 UTC m=+0.092121721 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:15:05 np0005554845 nova_compute[187128]: 2025-12-11 06:15:05.582 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:08 np0005554845 nova_compute[187128]: 2025-12-11 06:15:08.381 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.193 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.194 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.206 187132 DEBUG nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.284 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.285 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.294 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.295 187132 INFO nova.compute.claims [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.451 187132 DEBUG nova.compute.provider_tree [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.465 187132 DEBUG nova.scheduler.client.report [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.500 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.215s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.501 187132 DEBUG nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.562 187132 DEBUG nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.562 187132 DEBUG nova.network.neutron [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.585 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.592 187132 INFO nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.609 187132 DEBUG nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.846 187132 DEBUG nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.847 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.848 187132 INFO nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Creating image(s)
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.848 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "/var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.849 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.849 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.864 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.941 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.076s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.942 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.942 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:15:10 np0005554845 nova_compute[187128]: 2025-12-11 06:15:10.956 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.011 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.012 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.213 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk 1073741824" returned: 0 in 0.201s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.214 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.272s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.215 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.270 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.272 187132 DEBUG nova.virt.disk.api [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Checking if we can resize image /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.272 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.328 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.329 187132 DEBUG nova.virt.disk.api [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Cannot resize image /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.329 187132 DEBUG nova.objects.instance [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 4d2c2d90-6514-4e53-b77f-30e376bcb3ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.347 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.348 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Ensure instance console log exists: /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.349 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.349 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.350 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:15:11 np0005554845 nova_compute[187128]: 2025-12-11 06:15:11.574 187132 DEBUG nova.policy [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 11 01:15:13 np0005554845 nova_compute[187128]: 2025-12-11 06:15:13.383 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:15:14 np0005554845 nova_compute[187128]: 2025-12-11 06:15:14.687 187132 DEBUG nova.network.neutron [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Successfully created port: 63929a65-2f0b-481d-982f-4101b7879484 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 11 01:15:15 np0005554845 nova_compute[187128]: 2025-12-11 06:15:15.586 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:15:15 np0005554845 nova_compute[187128]: 2025-12-11 06:15:15.614 187132 DEBUG nova.network.neutron [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Successfully created port: 32cfa82e-ef0b-43a6-9378-5eec85606901 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 11 01:15:17 np0005554845 nova_compute[187128]: 2025-12-11 06:15:17.740 187132 DEBUG nova.network.neutron [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Successfully updated port: 63929a65-2f0b-481d-982f-4101b7879484 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 11 01:15:17 np0005554845 nova_compute[187128]: 2025-12-11 06:15:17.844 187132 DEBUG nova.compute.manager [req-d4631417-90a4-42e0-a54c-84798eaf7851 req-36880638-7668-4891-9486-72c0674532dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-changed-63929a65-2f0b-481d-982f-4101b7879484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:15:17 np0005554845 nova_compute[187128]: 2025-12-11 06:15:17.844 187132 DEBUG nova.compute.manager [req-d4631417-90a4-42e0-a54c-84798eaf7851 req-36880638-7668-4891-9486-72c0674532dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Refreshing instance network info cache due to event network-changed-63929a65-2f0b-481d-982f-4101b7879484. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 01:15:17 np0005554845 nova_compute[187128]: 2025-12-11 06:15:17.845 187132 DEBUG oslo_concurrency.lockutils [req-d4631417-90a4-42e0-a54c-84798eaf7851 req-36880638-7668-4891-9486-72c0674532dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 01:15:17 np0005554845 nova_compute[187128]: 2025-12-11 06:15:17.845 187132 DEBUG oslo_concurrency.lockutils [req-d4631417-90a4-42e0-a54c-84798eaf7851 req-36880638-7668-4891-9486-72c0674532dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 01:15:17 np0005554845 nova_compute[187128]: 2025-12-11 06:15:17.845 187132 DEBUG nova.network.neutron [req-d4631417-90a4-42e0-a54c-84798eaf7851 req-36880638-7668-4891-9486-72c0674532dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Refreshing network info cache for port 63929a65-2f0b-481d-982f-4101b7879484 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 11 01:15:18 np0005554845 nova_compute[187128]: 2025-12-11 06:15:18.386 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:15:18 np0005554845 nova_compute[187128]: 2025-12-11 06:15:18.493 187132 DEBUG nova.network.neutron [req-d4631417-90a4-42e0-a54c-84798eaf7851 req-36880638-7668-4891-9486-72c0674532dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 11 01:15:18 np0005554845 nova_compute[187128]: 2025-12-11 06:15:18.643 187132 INFO nova.compute.manager [None req-6aae4d67-d363-4fb3-a423-0203382a4c10 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Get console output
Dec 11 01:15:18 np0005554845 nova_compute[187128]: 2025-12-11 06:15:18.648 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 11 01:15:18 np0005554845 nova_compute[187128]: 2025-12-11 06:15:18.926 187132 DEBUG nova.network.neutron [req-d4631417-90a4-42e0-a54c-84798eaf7851 req-36880638-7668-4891-9486-72c0674532dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 01:15:18 np0005554845 nova_compute[187128]: 2025-12-11 06:15:18.944 187132 DEBUG oslo_concurrency.lockutils [req-d4631417-90a4-42e0-a54c-84798eaf7851 req-36880638-7668-4891-9486-72c0674532dd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.240 187132 DEBUG nova.compute.manager [req-592bb89c-e824-4dd6-9b24-6bd30e428b12 req-310f04fc-4fb5-423b-bf3a-79e4673db1c4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-changed-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.240 187132 DEBUG nova.compute.manager [req-592bb89c-e824-4dd6-9b24-6bd30e428b12 req-310f04fc-4fb5-423b-bf3a-79e4673db1c4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Refreshing instance network info cache due to event network-changed-39751729-025a-4280-89aa-883712fc8dcb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.240 187132 DEBUG oslo_concurrency.lockutils [req-592bb89c-e824-4dd6-9b24-6bd30e428b12 req-310f04fc-4fb5-423b-bf3a-79e4673db1c4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.241 187132 DEBUG oslo_concurrency.lockutils [req-592bb89c-e824-4dd6-9b24-6bd30e428b12 req-310f04fc-4fb5-423b-bf3a-79e4673db1c4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.241 187132 DEBUG nova.network.neutron [req-592bb89c-e824-4dd6-9b24-6bd30e428b12 req-310f04fc-4fb5-423b-bf3a-79e4673db1c4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Refreshing network info cache for port 39751729-025a-4280-89aa-883712fc8dcb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.720 187132 DEBUG nova.network.neutron [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Successfully updated port: 32cfa82e-ef0b-43a6-9378-5eec85606901 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.744 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.744 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquired lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.744 187132 DEBUG nova.network.neutron [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.901 187132 DEBUG nova.network.neutron [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.915 187132 DEBUG nova.compute.manager [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-vif-unplugged-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.915 187132 DEBUG oslo_concurrency.lockutils [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.916 187132 DEBUG oslo_concurrency.lockutils [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.916 187132 DEBUG oslo_concurrency.lockutils [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.916 187132 DEBUG nova.compute.manager [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] No waiting events found dispatching network-vif-unplugged-39751729-025a-4280-89aa-883712fc8dcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.916 187132 WARNING nova.compute.manager [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received unexpected event network-vif-unplugged-39751729-025a-4280-89aa-883712fc8dcb for instance with vm_state active and task_state None.
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.916 187132 DEBUG nova.compute.manager [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-changed-32cfa82e-ef0b-43a6-9378-5eec85606901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.916 187132 DEBUG nova.compute.manager [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Refreshing instance network info cache due to event network-changed-32cfa82e-ef0b-43a6-9378-5eec85606901. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 01:15:19 np0005554845 nova_compute[187128]: 2025-12-11 06:15:19.917 187132 DEBUG oslo_concurrency.lockutils [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 01:15:20 np0005554845 nova_compute[187128]: 2025-12-11 06:15:20.297 187132 INFO nova.compute.manager [None req-11b2ebef-a6f2-4ada-a398-9c3505dee043 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Get console output
Dec 11 01:15:20 np0005554845 nova_compute[187128]: 2025-12-11 06:15:20.302 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes
Dec 11 01:15:20 np0005554845 nova_compute[187128]: 2025-12-11 06:15:20.590 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:15:21 np0005554845 podman[221387]: 2025-12-11 06:15:21.127563374 +0000 UTC m=+0.052824030 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:15:21 np0005554845 nova_compute[187128]: 2025-12-11 06:15:21.973 187132 DEBUG nova.compute.manager [req-da574579-3483-4b48-a8bf-d0f1c03386db req-a74c4fa9-d92a-4352-a830-ac8f96eb3a4c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-changed-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:21 np0005554845 nova_compute[187128]: 2025-12-11 06:15:21.973 187132 DEBUG nova.compute.manager [req-da574579-3483-4b48-a8bf-d0f1c03386db req-a74c4fa9-d92a-4352-a830-ac8f96eb3a4c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Refreshing instance network info cache due to event network-changed-39751729-025a-4280-89aa-883712fc8dcb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:15:21 np0005554845 nova_compute[187128]: 2025-12-11 06:15:21.973 187132 DEBUG oslo_concurrency.lockutils [req-da574579-3483-4b48-a8bf-d0f1c03386db req-a74c4fa9-d92a-4352-a830-ac8f96eb3a4c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.005 187132 DEBUG nova.compute.manager [req-1183fbb4-42d7-489c-aef7-a9f4e1b61e9d req-199796a7-dd4e-4e1d-8e94-8ad23f1485f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.006 187132 DEBUG oslo_concurrency.lockutils [req-1183fbb4-42d7-489c-aef7-a9f4e1b61e9d req-199796a7-dd4e-4e1d-8e94-8ad23f1485f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.006 187132 DEBUG oslo_concurrency.lockutils [req-1183fbb4-42d7-489c-aef7-a9f4e1b61e9d req-199796a7-dd4e-4e1d-8e94-8ad23f1485f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.007 187132 DEBUG oslo_concurrency.lockutils [req-1183fbb4-42d7-489c-aef7-a9f4e1b61e9d req-199796a7-dd4e-4e1d-8e94-8ad23f1485f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.007 187132 DEBUG nova.compute.manager [req-1183fbb4-42d7-489c-aef7-a9f4e1b61e9d req-199796a7-dd4e-4e1d-8e94-8ad23f1485f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] No waiting events found dispatching network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.007 187132 WARNING nova.compute.manager [req-1183fbb4-42d7-489c-aef7-a9f4e1b61e9d req-199796a7-dd4e-4e1d-8e94-8ad23f1485f5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received unexpected event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb for instance with vm_state active and task_state None.#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.115 187132 INFO nova.compute.manager [None req-6d5603a4-c575-4720-b286-1137389afe6a 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Get console output#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.122 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.607 187132 DEBUG nova.network.neutron [req-592bb89c-e824-4dd6-9b24-6bd30e428b12 req-310f04fc-4fb5-423b-bf3a-79e4673db1c4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updated VIF entry in instance network info cache for port 39751729-025a-4280-89aa-883712fc8dcb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.607 187132 DEBUG nova.network.neutron [req-592bb89c-e824-4dd6-9b24-6bd30e428b12 req-310f04fc-4fb5-423b-bf3a-79e4673db1c4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updating instance_info_cache with network_info: [{"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.631 187132 DEBUG oslo_concurrency.lockutils [req-592bb89c-e824-4dd6-9b24-6bd30e428b12 req-310f04fc-4fb5-423b-bf3a-79e4673db1c4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.632 187132 DEBUG oslo_concurrency.lockutils [req-da574579-3483-4b48-a8bf-d0f1c03386db req-a74c4fa9-d92a-4352-a830-ac8f96eb3a4c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:15:22 np0005554845 nova_compute[187128]: 2025-12-11 06:15:22.633 187132 DEBUG nova.network.neutron [req-da574579-3483-4b48-a8bf-d0f1c03386db req-a74c4fa9-d92a-4352-a830-ac8f96eb3a4c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Refreshing network info cache for port 39751729-025a-4280-89aa-883712fc8dcb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:15:23 np0005554845 nova_compute[187128]: 2025-12-11 06:15:23.388 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:23 np0005554845 nova_compute[187128]: 2025-12-11 06:15:23.962 187132 DEBUG nova.network.neutron [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updating instance_info_cache with network_info: [{"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:15:23 np0005554845 nova_compute[187128]: 2025-12-11 06:15:23.986 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Releasing lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:15:23 np0005554845 nova_compute[187128]: 2025-12-11 06:15:23.986 187132 DEBUG nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Instance network_info: |[{"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:15:23 np0005554845 nova_compute[187128]: 2025-12-11 06:15:23.986 187132 DEBUG oslo_concurrency.lockutils [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:15:23 np0005554845 nova_compute[187128]: 2025-12-11 06:15:23.987 187132 DEBUG nova.network.neutron [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Refreshing network info cache for port 32cfa82e-ef0b-43a6-9378-5eec85606901 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:15:23 np0005554845 nova_compute[187128]: 2025-12-11 06:15:23.990 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Start _get_guest_xml network_info=[{"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, 
"physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:15:23 np0005554845 nova_compute[187128]: 2025-12-11 06:15:23.995 187132 WARNING nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.001 187132 DEBUG nova.virt.libvirt.host [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.002 187132 DEBUG nova.virt.libvirt.host [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.010 187132 DEBUG nova.virt.libvirt.host [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.010 187132 DEBUG nova.virt.libvirt.host [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.011 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.012 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.012 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.012 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.013 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.013 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.013 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.013 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.013 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.014 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.014 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.014 187132 DEBUG nova.virt.hardware [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.018 187132 DEBUG nova.virt.libvirt.vif [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:15:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1495480396',display_name='tempest-TestGettingAddress-server-1495480396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1495480396',id=37,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPxE6a7qoajtcVEApESceqdLqWS0grIVgvnvU7MQ5/B+0xinwqYVq27IHbe1pAlSX1R75zTl3qrHhAuAFc+Wdv5POSdffVcY3xpsVHBpr0U5d8WbecqL00KPsjsUVxxADA==',key_name='tempest-TestGettingAddress-1338945651',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-et3g81w5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:15:10Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=4d2c2d90-6514-4e53-b77f-30e376bcb3ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": 
false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.018 187132 DEBUG nova.network.os_vif_util [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.019 187132 DEBUG nova.network.os_vif_util [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:f4:66,bridge_name='br-int',has_traffic_filtering=True,id=63929a65-2f0b-481d-982f-4101b7879484,network=Network(70dc7c03-2005-47cf-a898-b31c3c862049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63929a65-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.019 187132 DEBUG nova.virt.libvirt.vif [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:15:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1495480396',display_name='tempest-TestGettingAddress-server-1495480396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1495480396',id=37,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPxE6a7qoajtcVEApESceqdLqWS0grIVgvnvU7MQ5/B+0xinwqYVq27IHbe1pAlSX1R75zTl3qrHhAuAFc+Wdv5POSdffVcY3xpsVHBpr0U5d8WbecqL00KPsjsUVxxADA==',key_name='tempest-TestGettingAddress-1338945651',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-et3g81w5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:15:10Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=4d2c2d90-6514-4e53-b77f-30e376bcb3ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.020 187132 DEBUG nova.network.os_vif_util [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.020 187132 DEBUG nova.network.os_vif_util [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:26:55,bridge_name='br-int',has_traffic_filtering=True,id=32cfa82e-ef0b-43a6-9378-5eec85606901,network=Network(5fd5f2b9-1570-4922-9d37-b3acee2aa306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32cfa82e-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.021 187132 DEBUG nova.objects.instance [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 4d2c2d90-6514-4e53-b77f-30e376bcb3ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.038 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  <uuid>4d2c2d90-6514-4e53-b77f-30e376bcb3ab</uuid>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  <name>instance-00000025</name>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestGettingAddress-server-1495480396</nova:name>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:15:23</nova:creationTime>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:        <nova:user uuid="60e9372de4754580913a836e11b9c248">tempest-TestGettingAddress-725523770-project-member</nova:user>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:        <nova:project uuid="79a211a6fc3c4f68b6c3d0ba433964d3">tempest-TestGettingAddress-725523770</nova:project>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:        <nova:port uuid="63929a65-2f0b-481d-982f-4101b7879484">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.5" ipVersion="4"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:        <nova:port uuid="32cfa82e-ef0b-43a6-9378-5eec85606901">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe79:2655" ipVersion="6"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <entry name="serial">4d2c2d90-6514-4e53-b77f-30e376bcb3ab</entry>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <entry name="uuid">4d2c2d90-6514-4e53-b77f-30e376bcb3ab</entry>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.config"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:1c:f4:66"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <target dev="tap63929a65-2f"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:79:26:55"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <target dev="tap32cfa82e-ef"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/console.log" append="off"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:15:24 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:15:24 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:15:24 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:15:24 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.039 187132 DEBUG nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Preparing to wait for external event network-vif-plugged-63929a65-2f0b-481d-982f-4101b7879484 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.039 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.040 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.040 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.040 187132 DEBUG nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Preparing to wait for external event network-vif-plugged-32cfa82e-ef0b-43a6-9378-5eec85606901 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.040 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.040 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.040 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.041 187132 DEBUG nova.virt.libvirt.vif [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:15:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1495480396',display_name='tempest-TestGettingAddress-server-1495480396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1495480396',id=37,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPxE6a7qoajtcVEApESceqdLqWS0grIVgvnvU7MQ5/B+0xinwqYVq27IHbe1pAlSX1R75zTl3qrHhAuAFc+Wdv5POSdffVcY3xpsVHBpr0U5d8WbecqL00KPsjsUVxxADA==',key_name='tempest-TestGettingAddress-1338945651',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-et3g81w5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:15:10Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=4d2c2d90-6514-4e53-b77f-30e376bcb3ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": 
{"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.041 187132 DEBUG nova.network.os_vif_util [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.042 187132 DEBUG nova.network.os_vif_util [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:1c:f4:66,bridge_name='br-int',has_traffic_filtering=True,id=63929a65-2f0b-481d-982f-4101b7879484,network=Network(70dc7c03-2005-47cf-a898-b31c3c862049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63929a65-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.042 187132 DEBUG os_vif [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:f4:66,bridge_name='br-int',has_traffic_filtering=True,id=63929a65-2f0b-481d-982f-4101b7879484,network=Network(70dc7c03-2005-47cf-a898-b31c3c862049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63929a65-2f') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.043 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.043 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.044 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.047 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.047 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap63929a65-2f, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.048 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap63929a65-2f, col_values=(('external_ids', {'iface-id': '63929a65-2f0b-481d-982f-4101b7879484', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:1c:f4:66', 'vm-uuid': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.049 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:24 np0005554845 NetworkManager[55529]: <info>  [1765433724.0505] manager: (tap63929a65-2f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/117)
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.052 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.056 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.057 187132 INFO os_vif [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:1c:f4:66,bridge_name='br-int',has_traffic_filtering=True,id=63929a65-2f0b-481d-982f-4101b7879484,network=Network(70dc7c03-2005-47cf-a898-b31c3c862049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63929a65-2f')#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.058 187132 DEBUG nova.virt.libvirt.vif [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:15:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1495480396',display_name='tempest-TestGettingAddress-server-1495480396',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1495480396',id=37,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPxE6a7qoajtcVEApESceqdLqWS0grIVgvnvU7MQ5/B+0xinwqYVq27IHbe1pAlSX1R75zTl3qrHhAuAFc+Wdv5POSdffVcY3xpsVHBpr0U5d8WbecqL00KPsjsUVxxADA==',key_name='tempest-TestGettingAddress-1338945651',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-et3g81w5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:15:10Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=4d2c2d90-6514-4e53-b77f-30e376bcb3ab,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.058 187132 DEBUG nova.network.os_vif_util [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.059 187132 DEBUG nova.network.os_vif_util [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:26:55,bridge_name='br-int',has_traffic_filtering=True,id=32cfa82e-ef0b-43a6-9378-5eec85606901,network=Network(5fd5f2b9-1570-4922-9d37-b3acee2aa306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32cfa82e-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.059 187132 DEBUG os_vif [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:26:55,bridge_name='br-int',has_traffic_filtering=True,id=32cfa82e-ef0b-43a6-9378-5eec85606901,network=Network(5fd5f2b9-1570-4922-9d37-b3acee2aa306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32cfa82e-ef') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.059 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.059 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.059 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.062 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.062 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap32cfa82e-ef, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.062 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap32cfa82e-ef, col_values=(('external_ids', {'iface-id': '32cfa82e-ef0b-43a6-9378-5eec85606901', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:79:26:55', 'vm-uuid': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.063 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:24 np0005554845 NetworkManager[55529]: <info>  [1765433724.0646] manager: (tap32cfa82e-ef): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/118)
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.066 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.070 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.070 187132 INFO os_vif [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:26:55,bridge_name='br-int',has_traffic_filtering=True,id=32cfa82e-ef0b-43a6-9378-5eec85606901,network=Network(5fd5f2b9-1570-4922-9d37-b3acee2aa306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32cfa82e-ef')#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.130 187132 DEBUG nova.compute.manager [req-c0db1a27-ee7a-45ae-b457-d30f79b717c5 req-ee7077e0-a607-4250-b804-64819fb7dc86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.130 187132 DEBUG oslo_concurrency.lockutils [req-c0db1a27-ee7a-45ae-b457-d30f79b717c5 req-ee7077e0-a607-4250-b804-64819fb7dc86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.130 187132 DEBUG oslo_concurrency.lockutils [req-c0db1a27-ee7a-45ae-b457-d30f79b717c5 req-ee7077e0-a607-4250-b804-64819fb7dc86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.131 187132 DEBUG oslo_concurrency.lockutils [req-c0db1a27-ee7a-45ae-b457-d30f79b717c5 req-ee7077e0-a607-4250-b804-64819fb7dc86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.131 187132 DEBUG nova.compute.manager [req-c0db1a27-ee7a-45ae-b457-d30f79b717c5 req-ee7077e0-a607-4250-b804-64819fb7dc86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] No waiting events found dispatching network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.131 187132 WARNING nova.compute.manager [req-c0db1a27-ee7a-45ae-b457-d30f79b717c5 req-ee7077e0-a607-4250-b804-64819fb7dc86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received unexpected event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb for instance with vm_state active and task_state None.#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.152 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.152 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.153 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No VIF found with MAC fa:16:3e:1c:f4:66, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.153 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No VIF found with MAC fa:16:3e:79:26:55, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:15:24 np0005554845 nova_compute[187128]: 2025-12-11 06:15:24.153 187132 INFO nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Using config drive#033[00m
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.292 187132 INFO nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Creating config drive at /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.config#033[00m
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.297 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7rep3t_h execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.347 187132 DEBUG nova.network.neutron [req-da574579-3483-4b48-a8bf-d0f1c03386db req-a74c4fa9-d92a-4352-a830-ac8f96eb3a4c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updated VIF entry in instance network info cache for port 39751729-025a-4280-89aa-883712fc8dcb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.348 187132 DEBUG nova.network.neutron [req-da574579-3483-4b48-a8bf-d0f1c03386db req-a74c4fa9-d92a-4352-a830-ac8f96eb3a4c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updating instance_info_cache with network_info: [{"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.367 187132 DEBUG oslo_concurrency.lockutils [req-da574579-3483-4b48-a8bf-d0f1c03386db req-a74c4fa9-d92a-4352-a830-ac8f96eb3a4c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.425 187132 DEBUG oslo_concurrency.processutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp7rep3t_h" returned: 0 in 0.128s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:15:25 np0005554845 NetworkManager[55529]: <info>  [1765433725.4801] manager: (tap63929a65-2f): new Tun device (/org/freedesktop/NetworkManager/Devices/119)
Dec 11 01:15:25 np0005554845 kernel: tap63929a65-2f: entered promiscuous mode
Dec 11 01:15:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:25Z|00226|binding|INFO|Claiming lport 63929a65-2f0b-481d-982f-4101b7879484 for this chassis.
Dec 11 01:15:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:25Z|00227|binding|INFO|63929a65-2f0b-481d-982f-4101b7879484: Claiming fa:16:3e:1c:f4:66 10.100.0.5
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.486 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.494 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:f4:66 10.100.0.5'], port_security=['fa:16:3e:1c:f4:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70dc7c03-2005-47cf-a898-b31c3c862049', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '74c01a76-4421-4dd2-a8ba-7cd22c52b13e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0310888c-82ff-49ca-8771-e30be2399d81, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=63929a65-2f0b-481d-982f-4101b7879484) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.495 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 63929a65-2f0b-481d-982f-4101b7879484 in datapath 70dc7c03-2005-47cf-a898-b31c3c862049 bound to our chassis#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.498 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 70dc7c03-2005-47cf-a898-b31c3c862049#033[00m
Dec 11 01:15:25 np0005554845 NetworkManager[55529]: <info>  [1765433725.5003] manager: (tap32cfa82e-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/120)
Dec 11 01:15:25 np0005554845 kernel: tap32cfa82e-ef: entered promiscuous mode
Dec 11 01:15:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:25Z|00228|binding|INFO|Setting lport 63929a65-2f0b-481d-982f-4101b7879484 ovn-installed in OVS
Dec 11 01:15:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:25Z|00229|binding|INFO|Setting lport 63929a65-2f0b-481d-982f-4101b7879484 up in Southbound
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.508 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.510 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:25Z|00230|if_status|INFO|Not updating pb chassis for 32cfa82e-ef0b-43a6-9378-5eec85606901 now as sb is readonly
Dec 11 01:15:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:25Z|00231|binding|INFO|Claiming lport 32cfa82e-ef0b-43a6-9378-5eec85606901 for this chassis.
Dec 11 01:15:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:25Z|00232|binding|INFO|32cfa82e-ef0b-43a6-9378-5eec85606901: Claiming fa:16:3e:79:26:55 2001:db8::f816:3eff:fe79:2655
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.514 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[61aab405-81eb-4bc9-a4f2-4d52b7629652]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.514 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap70dc7c03-21 in ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.516 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap70dc7c03-20 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.516 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2533719d-5a4f-4788-9ff8-5df10ed462e1]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.517 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ea922381-4ac0-4d98-b897-7625e6c70157]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 systemd-udevd[221438]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:15:25 np0005554845 systemd-udevd[221439]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.525 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:26:55 2001:db8::f816:3eff:fe79:2655'], port_security=['fa:16:3e:79:26:55 2001:db8::f816:3eff:fe79:2655'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe79:2655/64', 'neutron:device_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fd5f2b9-1570-4922-9d37-b3acee2aa306', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '74c01a76-4421-4dd2-a8ba-7cd22c52b13e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c87d4ad4-67f8-4533-bf56-ecd743daead8, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=32cfa82e-ef0b-43a6-9378-5eec85606901) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.528 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:25Z|00233|binding|INFO|Setting lport 32cfa82e-ef0b-43a6-9378-5eec85606901 ovn-installed in OVS
Dec 11 01:15:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:25Z|00234|binding|INFO|Setting lport 32cfa82e-ef0b-43a6-9378-5eec85606901 up in Southbound
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.532 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.533 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.534 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[7ae9a45a-8a55-4a9a-bd68-44bb75f447cd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 NetworkManager[55529]: <info>  [1765433725.5404] device (tap32cfa82e-ef): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:15:25 np0005554845 NetworkManager[55529]: <info>  [1765433725.5410] device (tap32cfa82e-ef): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:15:25 np0005554845 NetworkManager[55529]: <info>  [1765433725.5454] device (tap63929a65-2f): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:15:25 np0005554845 NetworkManager[55529]: <info>  [1765433725.5457] device (tap63929a65-2f): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:15:25 np0005554845 systemd-machined[153381]: New machine qemu-17-instance-00000025.
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.548 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[103f2465-af2a-4244-b49f-7c74d2a6b76a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 systemd[1]: Started Virtual Machine qemu-17-instance-00000025.
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.583 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[aa02a19a-a0d4-4665-9478-ddf8c2aa2901]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 NetworkManager[55529]: <info>  [1765433725.5926] manager: (tap70dc7c03-20): new Veth device (/org/freedesktop/NetworkManager/Devices/121)
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.592 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a846dea1-16c0-43ec-80a7-506b6393d043]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.593 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.625 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[0cfc0b56-ee43-4483-9e65-74e6b86a2ddc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.628 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[47b50091-e130-4f43-93ff-1e403260fc0e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 NetworkManager[55529]: <info>  [1765433725.6542] device (tap70dc7c03-20): carrier: link connected
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.660 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[be91c29c-4e6a-4ad1-b272-2a78801cdcfa]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.679 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[995df0eb-20ab-41e1-a8f8-6118d3811acb]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap70dc7c03-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:ca:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 400030, 'reachable_time': 28333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221474, 'error': None, 'target': 'ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.703 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ff201385-f60a-4597-a593-3ca72de5c813]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe06:ca6c'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 400030, 'tstamp': 400030}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221475, 'error': None, 'target': 'ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.725 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5e6e8c04-824b-483a-84b7-966727b728ab]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap70dc7c03-21'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:06:ca:6c'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 70], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 400030, 'reachable_time': 28333, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221476, 'error': None, 'target': 'ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.766 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[eb2485c4-1bb0-4013-a3ed-ee5ca58962e2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.841 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0b994586-561f-40f1-b127-20001a93ce48]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.842 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70dc7c03-20, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.843 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.843 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70dc7c03-20, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:25 np0005554845 kernel: tap70dc7c03-20: entered promiscuous mode
Dec 11 01:15:25 np0005554845 NetworkManager[55529]: <info>  [1765433725.8460] manager: (tap70dc7c03-20): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/122)
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.845 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.848 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap70dc7c03-20, col_values=(('external_ids', {'iface-id': 'f9fad358-fd40-4b95-a136-3d0381c2bd29'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:25Z|00235|binding|INFO|Releasing lport f9fad358-fd40-4b95-a136-3d0381c2bd29 from this chassis (sb_readonly=0)
Dec 11 01:15:25 np0005554845 nova_compute[187128]: 2025-12-11 06:15:25.860 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.863 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/70dc7c03-2005-47cf-a898-b31c3c862049.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/70dc7c03-2005-47cf-a898-b31c3c862049.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.864 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c7ccc38d-8b22-4a0a-a2e6-a0da1448dd78]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.866 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-70dc7c03-2005-47cf-a898-b31c3c862049
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/70dc7c03-2005-47cf-a898-b31c3c862049.pid.haproxy
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 70dc7c03-2005-47cf-a898-b31c3c862049
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:15:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:25.866 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049', 'env', 'PROCESS_TAG=haproxy-70dc7c03-2005-47cf-a898-b31c3c862049', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/70dc7c03-2005-47cf-a898-b31c3c862049.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.205 187132 DEBUG nova.compute.manager [req-a9d7e328-a480-4e0e-9329-7c051ba6d621 req-ea808ca1-d711-45d1-970c-3ee83511dccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-vif-plugged-63929a65-2f0b-481d-982f-4101b7879484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.205 187132 DEBUG oslo_concurrency.lockutils [req-a9d7e328-a480-4e0e-9329-7c051ba6d621 req-ea808ca1-d711-45d1-970c-3ee83511dccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.205 187132 DEBUG oslo_concurrency.lockutils [req-a9d7e328-a480-4e0e-9329-7c051ba6d621 req-ea808ca1-d711-45d1-970c-3ee83511dccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.206 187132 DEBUG oslo_concurrency.lockutils [req-a9d7e328-a480-4e0e-9329-7c051ba6d621 req-ea808ca1-d711-45d1-970c-3ee83511dccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.206 187132 DEBUG nova.compute.manager [req-a9d7e328-a480-4e0e-9329-7c051ba6d621 req-ea808ca1-d711-45d1-970c-3ee83511dccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Processing event network-vif-plugged-63929a65-2f0b-481d-982f-4101b7879484 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.206 187132 DEBUG nova.compute.manager [req-a9d7e328-a480-4e0e-9329-7c051ba6d621 req-ea808ca1-d711-45d1-970c-3ee83511dccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-vif-plugged-63929a65-2f0b-481d-982f-4101b7879484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.206 187132 DEBUG oslo_concurrency.lockutils [req-a9d7e328-a480-4e0e-9329-7c051ba6d621 req-ea808ca1-d711-45d1-970c-3ee83511dccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.207 187132 DEBUG oslo_concurrency.lockutils [req-a9d7e328-a480-4e0e-9329-7c051ba6d621 req-ea808ca1-d711-45d1-970c-3ee83511dccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.207 187132 DEBUG oslo_concurrency.lockutils [req-a9d7e328-a480-4e0e-9329-7c051ba6d621 req-ea808ca1-d711-45d1-970c-3ee83511dccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.207 187132 DEBUG nova.compute.manager [req-a9d7e328-a480-4e0e-9329-7c051ba6d621 req-ea808ca1-d711-45d1-970c-3ee83511dccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] No event matching network-vif-plugged-63929a65-2f0b-481d-982f-4101b7879484 in dict_keys([('network-vif-plugged', '32cfa82e-ef0b-43a6-9378-5eec85606901')]) pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:325#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.207 187132 WARNING nova.compute.manager [req-a9d7e328-a480-4e0e-9329-7c051ba6d621 req-ea808ca1-d711-45d1-970c-3ee83511dccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received unexpected event network-vif-plugged-63929a65-2f0b-481d-982f-4101b7879484 for instance with vm_state building and task_state spawning.#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.227 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.228 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.230 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:26 np0005554845 podman[221508]: 2025-12-11 06:15:26.256697541 +0000 UTC m=+0.068681842 container create b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:15:26 np0005554845 systemd[1]: Started libpod-conmon-b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7.scope.
Dec 11 01:15:26 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:15:26 np0005554845 podman[221508]: 2025-12-11 06:15:26.21922725 +0000 UTC m=+0.031211591 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:15:26 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/7bd4cf718ec006296757ce05315646fa2a1bf95b2c99308ce08ab43eeae15849/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.327 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433726.3269124, 4d2c2d90-6514-4e53-b77f-30e376bcb3ab => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.328 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] VM Started (Lifecycle Event)#033[00m
Dec 11 01:15:26 np0005554845 podman[221508]: 2025-12-11 06:15:26.330891383 +0000 UTC m=+0.142875714 container init b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 01:15:26 np0005554845 podman[221508]: 2025-12-11 06:15:26.336540637 +0000 UTC m=+0.148524938 container start b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.346 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.357 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433726.3273587, 4d2c2d90-6514-4e53-b77f-30e376bcb3ab => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.357 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:15:26 np0005554845 neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049[221531]: [NOTICE]   (221551) : New worker (221556) forked
Dec 11 01:15:26 np0005554845 neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049[221531]: [NOTICE]   (221551) : Loading success.
Dec 11 01:15:26 np0005554845 podman[221528]: 2025-12-11 06:15:26.385294895 +0000 UTC m=+0.090271081 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.390 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.393 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.395 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 32cfa82e-ef0b-43a6-9378-5eec85606901 in datapath 5fd5f2b9-1570-4922-9d37-b3acee2aa306 unbound from our chassis#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.397 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5fd5f2b9-1570-4922-9d37-b3acee2aa306#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.407 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[374c8f8a-4d67-4071-885c-25cd802bef71]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.407 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5fd5f2b9-11 in ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.409 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5fd5f2b9-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.409 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[54281236-2058-4a15-8692-1ccde4adacf0]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.410 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b001cea5-3488-42c6-aad2-e109a73a0e6d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.420 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[bae1ab7e-3c3e-4f0f-a22c-ee82168451cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.422 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.433 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[51327e16-7e4d-49c5-a352-a9be283c2012]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.457 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[756269fb-de26-4ff0-bcd6-e654832e2f9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.464 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[e7e21622-6f28-4945-b22c-e78ab775716f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 NetworkManager[55529]: <info>  [1765433726.4652] manager: (tap5fd5f2b9-10): new Veth device (/org/freedesktop/NetworkManager/Devices/123)
Dec 11 01:15:26 np0005554845 systemd-udevd[221468]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.496 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[0ccd8758-d304-4482-bb8f-e2b03dbbd67d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.499 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[69bcb389-8d92-4120-bf1e-beed4017d958]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 NetworkManager[55529]: <info>  [1765433726.5200] device (tap5fd5f2b9-10): carrier: link connected
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.523 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[0ad3fda0-48a4-42cb-a6cc-5166e16da0a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.547 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b731b23a-1ab2-447b-85f8-2d2097a7e7b8]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fd5f2b9-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:cc:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 400116, 'reachable_time': 40124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221575, 'error': None, 'target': 'ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.565 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2ffa6602-f8d4-4cfc-a39f-14264cab8fdb]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feaf:cc3f'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 400116, 'tstamp': 400116}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 221576, 'error': None, 'target': 'ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.587 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7d80d06b-d9f9-411b-b44b-598e28e770c6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5fd5f2b9-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:af:cc:3f'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 71], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 400116, 'reachable_time': 40124, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 221577, 'error': None, 'target': 'ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.620 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[07f7ec0f-63bf-43f8-b811-a1f5681cdbc3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.651 187132 DEBUG nova.network.neutron [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updated VIF entry in instance network info cache for port 32cfa82e-ef0b-43a6-9378-5eec85606901. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.651 187132 DEBUG nova.network.neutron [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updating instance_info_cache with network_info: [{"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.661 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1263566f-67a4-4df7-be35-69f2348958fe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.663 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fd5f2b9-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.663 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.663 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5fd5f2b9-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:26 np0005554845 kernel: tap5fd5f2b9-10: entered promiscuous mode
Dec 11 01:15:26 np0005554845 NetworkManager[55529]: <info>  [1765433726.6666] manager: (tap5fd5f2b9-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/124)
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.665 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.669 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5fd5f2b9-10, col_values=(('external_ids', {'iface-id': 'e6bf5322-feeb-40ef-a479-f4418af08bd9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:26 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:26Z|00236|binding|INFO|Releasing lport e6bf5322-feeb-40ef-a479-f4418af08bd9 from this chassis (sb_readonly=0)
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.670 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.672 187132 DEBUG oslo_concurrency.lockutils [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.673 187132 DEBUG nova.compute.manager [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.673 187132 DEBUG oslo_concurrency.lockutils [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.673 187132 DEBUG oslo_concurrency.lockutils [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.674 187132 DEBUG oslo_concurrency.lockutils [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.674 187132 DEBUG nova.compute.manager [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] No waiting events found dispatching network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.674 187132 WARNING nova.compute.manager [req-b6ef691b-da90-411b-ab7e-44a89ff5186e req-8b913ff3-55b6-44b7-a6c4-3dc15542599c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received unexpected event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb for instance with vm_state active and task_state None.#033[00m
Dec 11 01:15:26 np0005554845 nova_compute[187128]: 2025-12-11 06:15:26.682 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.683 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5fd5f2b9-1570-4922-9d37-b3acee2aa306.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5fd5f2b9-1570-4922-9d37-b3acee2aa306.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.684 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6c3b2c93-c744-4fbd-bd77-c922e77e3379]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.685 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-5fd5f2b9-1570-4922-9d37-b3acee2aa306
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/5fd5f2b9-1570-4922-9d37-b3acee2aa306.pid.haproxy
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 5fd5f2b9-1570-4922-9d37-b3acee2aa306
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:15:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:26.685 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306', 'env', 'PROCESS_TAG=haproxy-5fd5f2b9-1570-4922-9d37-b3acee2aa306', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5fd5f2b9-1570-4922-9d37-b3acee2aa306.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:15:27 np0005554845 podman[221606]: 2025-12-11 06:15:27.012789863 +0000 UTC m=+0.043333722 container create 5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 11 01:15:27 np0005554845 systemd[1]: Started libpod-conmon-5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a.scope.
Dec 11 01:15:27 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:15:27 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/3da898272f46ab7b3f3aadf00b44d6d92647b9d8e6d81fe1c77f543d417c805e/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:15:27 np0005554845 podman[221606]: 2025-12-11 06:15:27.079795379 +0000 UTC m=+0.110339288 container init 5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 11 01:15:27 np0005554845 podman[221606]: 2025-12-11 06:15:27.085630378 +0000 UTC m=+0.116174247 container start 5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 01:15:27 np0005554845 podman[221606]: 2025-12-11 06:15:26.990412383 +0000 UTC m=+0.020956272 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:15:27 np0005554845 neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306[221621]: [NOTICE]   (221625) : New worker (221627) forked
Dec 11 01:15:27 np0005554845 neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306[221621]: [NOTICE]   (221625) : Loading success.
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.239 187132 DEBUG oslo_concurrency.lockutils [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.239 187132 DEBUG oslo_concurrency.lockutils [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.240 187132 DEBUG oslo_concurrency.lockutils [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.240 187132 DEBUG oslo_concurrency.lockutils [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.240 187132 DEBUG oslo_concurrency.lockutils [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.241 187132 INFO nova.compute.manager [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Terminating instance#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.242 187132 DEBUG nova.compute.manager [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:15:27 np0005554845 kernel: tap39751729-02 (unregistering): left promiscuous mode
Dec 11 01:15:27 np0005554845 NetworkManager[55529]: <info>  [1765433727.2614] device (tap39751729-02): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:15:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:27Z|00237|binding|INFO|Releasing lport 39751729-025a-4280-89aa-883712fc8dcb from this chassis (sb_readonly=0)
Dec 11 01:15:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:27Z|00238|binding|INFO|Setting lport 39751729-025a-4280-89aa-883712fc8dcb down in Southbound
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.300 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:27 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:27Z|00239|binding|INFO|Removing iface tap39751729-02 ovn-installed in OVS
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.315 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:34:76:eb 10.100.0.11'], port_security=['fa:16:3e:34:76:eb 10.100.0.11'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.11/28', 'neutron:device_id': 'a64d006a-fa23-4538-a7c4-57160050b331', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-15a011b8-4d7f-4851-9aed-d01bb5a29d21', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': 'fce35ab888e44e46b3108813dcdf4163', 'neutron:revision_number': '8', 'neutron:security_group_ids': '0b532bd0-5331-4d54-b5de-ee38552da3b9', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=3b2b0d82-7dfc-4180-a13c-cadf8e0c53cc, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=39751729-025a-4280-89aa-883712fc8dcb) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.317 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 39751729-025a-4280-89aa-883712fc8dcb in datapath 15a011b8-4d7f-4851-9aed-d01bb5a29d21 unbound from our chassis#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.317 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.319 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 15a011b8-4d7f-4851-9aed-d01bb5a29d21, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.320 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7f10144d-6776-4694-b385-d4e49ffcf0ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.321 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21 namespace which is not needed anymore#033[00m
Dec 11 01:15:27 np0005554845 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000021.scope: Deactivated successfully.
Dec 11 01:15:27 np0005554845 systemd[1]: machine-qemu\x2d16\x2dinstance\x2d00000021.scope: Consumed 14.616s CPU time.
Dec 11 01:15:27 np0005554845 systemd-machined[153381]: Machine qemu-16-instance-00000021 terminated.
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.434 187132 DEBUG nova.compute.manager [req-5a881a37-c364-4e8e-9648-730944dcec50 req-fb2b088c-c7c8-4dfd-ba07-c8ed0b382d12 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-vif-plugged-32cfa82e-ef0b-43a6-9378-5eec85606901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.435 187132 DEBUG oslo_concurrency.lockutils [req-5a881a37-c364-4e8e-9648-730944dcec50 req-fb2b088c-c7c8-4dfd-ba07-c8ed0b382d12 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.435 187132 DEBUG oslo_concurrency.lockutils [req-5a881a37-c364-4e8e-9648-730944dcec50 req-fb2b088c-c7c8-4dfd-ba07-c8ed0b382d12 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.435 187132 DEBUG oslo_concurrency.lockutils [req-5a881a37-c364-4e8e-9648-730944dcec50 req-fb2b088c-c7c8-4dfd-ba07-c8ed0b382d12 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.436 187132 DEBUG nova.compute.manager [req-5a881a37-c364-4e8e-9648-730944dcec50 req-fb2b088c-c7c8-4dfd-ba07-c8ed0b382d12 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Processing event network-vif-plugged-32cfa82e-ef0b-43a6-9378-5eec85606901 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.436 187132 DEBUG nova.compute.manager [req-5a881a37-c364-4e8e-9648-730944dcec50 req-fb2b088c-c7c8-4dfd-ba07-c8ed0b382d12 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-vif-plugged-32cfa82e-ef0b-43a6-9378-5eec85606901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.436 187132 DEBUG oslo_concurrency.lockutils [req-5a881a37-c364-4e8e-9648-730944dcec50 req-fb2b088c-c7c8-4dfd-ba07-c8ed0b382d12 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.436 187132 DEBUG oslo_concurrency.lockutils [req-5a881a37-c364-4e8e-9648-730944dcec50 req-fb2b088c-c7c8-4dfd-ba07-c8ed0b382d12 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.437 187132 DEBUG oslo_concurrency.lockutils [req-5a881a37-c364-4e8e-9648-730944dcec50 req-fb2b088c-c7c8-4dfd-ba07-c8ed0b382d12 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.437 187132 DEBUG nova.compute.manager [req-5a881a37-c364-4e8e-9648-730944dcec50 req-fb2b088c-c7c8-4dfd-ba07-c8ed0b382d12 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] No waiting events found dispatching network-vif-plugged-32cfa82e-ef0b-43a6-9378-5eec85606901 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.437 187132 WARNING nova.compute.manager [req-5a881a37-c364-4e8e-9648-730944dcec50 req-fb2b088c-c7c8-4dfd-ba07-c8ed0b382d12 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received unexpected event network-vif-plugged-32cfa82e-ef0b-43a6-9378-5eec85606901 for instance with vm_state building and task_state spawning.#033[00m
Dec 11 01:15:27 np0005554845 neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21[221186]: [NOTICE]   (221190) : haproxy version is 2.8.14-c23fe91
Dec 11 01:15:27 np0005554845 neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21[221186]: [NOTICE]   (221190) : path to executable is /usr/sbin/haproxy
Dec 11 01:15:27 np0005554845 neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21[221186]: [WARNING]  (221190) : Exiting Master process...
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.438 187132 DEBUG nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Instance event wait completed in 1 seconds for network-vif-plugged,network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:15:27 np0005554845 neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21[221186]: [ALERT]    (221190) : Current worker (221192) exited with code 143 (Terminated)
Dec 11 01:15:27 np0005554845 neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21[221186]: [WARNING]  (221190) : All workers exited. Exiting... (0)
Dec 11 01:15:27 np0005554845 systemd[1]: libpod-aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239.scope: Deactivated successfully.
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.445 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433727.4445255, 4d2c2d90-6514-4e53-b77f-30e376bcb3ab => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.446 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.448 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:15:27 np0005554845 podman[221657]: 2025-12-11 06:15:27.449488902 +0000 UTC m=+0.045035709 container died aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.455 187132 INFO nova.virt.libvirt.driver [-] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Instance spawned successfully.#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.456 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.462 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.480 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.491 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.492 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.492 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.493 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.493 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.494 187132 DEBUG nova.virt.libvirt.driver [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:15:27 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239-userdata-shm.mount: Deactivated successfully.
Dec 11 01:15:27 np0005554845 systemd[1]: var-lib-containers-storage-overlay-3fe29df74c260cb7a96bdcca09e43414acce3e8197c9c1ed47bb459cf86e5d8a-merged.mount: Deactivated successfully.
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.510 187132 INFO nova.virt.libvirt.driver [-] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Instance destroyed successfully.#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.510 187132 DEBUG nova.objects.instance [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lazy-loading 'resources' on Instance uuid a64d006a-fa23-4538-a7c4-57160050b331 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:15:27 np0005554845 podman[221657]: 2025-12-11 06:15:27.519756777 +0000 UTC m=+0.115303584 container cleanup aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 01:15:27 np0005554845 systemd[1]: libpod-conmon-aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239.scope: Deactivated successfully.
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.540 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.541 187132 DEBUG nova.virt.libvirt.vif [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:14:27Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkBasicOps-server-1010928387',display_name='tempest-TestNetworkBasicOps-server-1010928387',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkbasicops-server-1010928387',id=33,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBChn3ME9G0ksjL7sDCwE+QfyqqlTHK8p/F3BNnS1ZWdXbZo0Zm3wICJKwIiepJAgzTU7UPI647ehkYD0brI4z155ZR9zh3lTJ1lrapS7o+flyaWl8nGk6DMyBb5QomTfgw==',key_name='tempest-TestNetworkBasicOps-653853826',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:14:40Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='fce35ab888e44e46b3108813dcdf4163',ramdisk_id='',reservation_id='r-65glwa09',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkBasicOps-1486719489',owner_user_name='tempest-TestNetworkBasicOps-1486719489-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:14:40Z,user_data=None,user_id='3b482a000b3e4b5c964be05bad2a0418',uuid=a64d006a-fa23-4538-a7c4-57160050b331,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", 
"version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.541 187132 DEBUG nova.network.os_vif_util [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converting VIF {"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.246", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.541 187132 DEBUG nova.network.os_vif_util [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:34:76:eb,bridge_name='br-int',has_traffic_filtering=True,id=39751729-025a-4280-89aa-883712fc8dcb,network=Network(15a011b8-4d7f-4851-9aed-d01bb5a29d21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39751729-02') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.542 187132 DEBUG os_vif [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:76:eb,bridge_name='br-int',has_traffic_filtering=True,id=39751729-025a-4280-89aa-883712fc8dcb,network=Network(15a011b8-4d7f-4851-9aed-d01bb5a29d21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39751729-02') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.543 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.543 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap39751729-02, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.544 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.546 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.547 187132 INFO os_vif [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:34:76:eb,bridge_name='br-int',has_traffic_filtering=True,id=39751729-025a-4280-89aa-883712fc8dcb,network=Network(15a011b8-4d7f-4851-9aed-d01bb5a29d21),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap39751729-02')#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.548 187132 INFO nova.virt.libvirt.driver [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Deleting instance files /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331_del#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.548 187132 INFO nova.virt.libvirt.driver [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Deletion of /var/lib/nova/instances/a64d006a-fa23-4538-a7c4-57160050b331_del complete#033[00m
Dec 11 01:15:27 np0005554845 podman[221704]: 2025-12-11 06:15:27.575008402 +0000 UTC m=+0.035748155 container remove aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.586 187132 INFO nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Took 16.74 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.587 187132 DEBUG nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.588 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c093b46f-3694-40c6-88ea-00cbdbf6ff9d]: (4, ('Thu Dec 11 06:15:27 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21 (aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239)\naea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239\nThu Dec 11 06:15:27 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21 (aea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239)\naea0d9731961f9423415323a5c893331065e6c8c3f9045eee028d0be9ff93239\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.590 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b9194893-58bf-4037-9e37-a5660ebaba34]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.595 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap15a011b8-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.596 187132 INFO nova.compute.manager [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.597 187132 DEBUG oslo.service.loopingcall [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.597 187132 DEBUG nova.compute.manager [-] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.597 187132 DEBUG nova.network.neutron [-] [instance: a64d006a-fa23-4538-a7c4-57160050b331] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.600 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:27 np0005554845 kernel: tap15a011b8-40: left promiscuous mode
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.617 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.621 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[bfcff49c-b29b-4b61-b8a3-ed2e31e7d43e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.631 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3cd67726-5643-4d85-8e4b-a6c054bda26f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.632 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[12af3655-4a34-46e6-b99e-458e6acbc134]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.651 187132 INFO nova.compute.manager [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Took 17.40 seconds to build instance.#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.650 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ac12fe02-7077-4d61-b4dc-39f37248ff8f]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 395435, 'reachable_time': 16188, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221719, 'error': None, 'target': 'ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:27 np0005554845 systemd[1]: run-netns-ovnmeta\x2d15a011b8\x2d4d7f\x2d4851\x2d9aed\x2dd01bb5a29d21.mount: Deactivated successfully.
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.657 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-15a011b8-4d7f-4851-9aed-d01bb5a29d21 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:15:27 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:27.657 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[81091c69-3d2c-48fb-8df2-b1c759745c5b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:27 np0005554845 nova_compute[187128]: 2025-12-11 06:15:27.665 187132 DEBUG oslo_concurrency.lockutils [None req-06e2f6e7-97b6-4aba-8a22-71779ba6edba 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 17.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.294 187132 DEBUG nova.compute.manager [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-changed-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.294 187132 DEBUG nova.compute.manager [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Refreshing instance network info cache due to event network-changed-39751729-025a-4280-89aa-883712fc8dcb. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.295 187132 DEBUG oslo_concurrency.lockutils [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.295 187132 DEBUG oslo_concurrency.lockutils [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.295 187132 DEBUG nova.network.neutron [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Refreshing network info cache for port 39751729-025a-4280-89aa-883712fc8dcb _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.647 187132 DEBUG nova.network.neutron [-] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.668 187132 INFO nova.compute.manager [-] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Took 1.07 seconds to deallocate network for instance.#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.715 187132 DEBUG oslo_concurrency.lockutils [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.716 187132 DEBUG oslo_concurrency.lockutils [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.767 187132 DEBUG nova.scheduler.client.report [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Refreshing inventories for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.789 187132 DEBUG nova.scheduler.client.report [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Updating ProviderTree inventory for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.789 187132 DEBUG nova.compute.provider_tree [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.812 187132 DEBUG nova.scheduler.client.report [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Refreshing aggregate associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.843 187132 DEBUG nova.scheduler.client.report [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Refreshing trait associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, traits: COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.940 187132 DEBUG nova.compute.provider_tree [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:15:28 np0005554845 nova_compute[187128]: 2025-12-11 06:15:28.967 187132 DEBUG nova.scheduler.client.report [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.047 187132 DEBUG oslo_concurrency.lockutils [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.331s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.082 187132 INFO nova.scheduler.client.report [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Deleted allocations for instance a64d006a-fa23-4538-a7c4-57160050b331#033[00m
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.176 187132 DEBUG oslo_concurrency.lockutils [None req-7ea72fbd-3eaf-4b3a-a6da-1928c6f3578d 3b482a000b3e4b5c964be05bad2a0418 fce35ab888e44e46b3108813dcdf4163 - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.936s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.737 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.738 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.739 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.739 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.836 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:15:29 np0005554845 podman[221722]: 2025-12-11 06:15:29.892343594 +0000 UTC m=+0.095296187 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 11 01:15:29 np0005554845 podman[221721]: 2025-12-11 06:15:29.892878359 +0000 UTC m=+0.099427600 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.910 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.911 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:15:29 np0005554845 nova_compute[187128]: 2025-12-11 06:15:29.998 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk --force-share --output=json" returned: 0 in 0.086s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.100 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'name': 'tempest-TestGettingAddress-server-1495480396', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000025', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'user_id': '60e9372de4754580913a836e11b9c248', 'hostId': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.101 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.103 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4d2c2d90-6514-4e53-b77f-30e376bcb3ab / tap63929a65-2f inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.104 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 4d2c2d90-6514-4e53-b77f-30e376bcb3ab / tap32cfa82e-ef inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.104 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.104 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f4476aa0-bc0f-4bbe-afa8-a508f4a05a1f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap63929a65-2f', 'timestamp': '2025-12-11T06:15:30.101468', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap63929a65-2f', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:f4:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63929a65-2f'}, 'message_id': 'cb14239a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '0855ced982075a340a8658ba2a0580ca49c226424dc30faa608abc7d8a76694a'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap32cfa82e-ef', 'timestamp': '2025-12-11T06:15:30.101468', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap32cfa82e-ef', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:26:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap32cfa82e-ef'}, 'message_id': 'cb14306a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '489b322cc326407e4a6d2d4c70582979003bfa8665739e1058e9d453d3af9195'}]}, 'timestamp': '2025-12-11 06:15:30.105061', '_unique_id': '8bb0286abd414fdcae435cc0a782219d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.135 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.135 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1a2e4480-f686-4e91-a3b8-d7cf78b5bb5f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-vda', 'timestamp': '2025-12-11T06:15:30.107008', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb18da5c-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': 'a4579acca906591696bfdabdab98d73b032f3d8f425ee2efb282d6d8213ae41b'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-sda', 'timestamp': '2025-12-11T06:15:30.107008', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb18e8bc-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': '416f71faf4dbd1a02c0cdac864eeea8e3e7a4eeb86abe322f1bac1b543c929a2'}]}, 'timestamp': '2025-12-11 06:15:30.136032', '_unique_id': 'e515874d8c46462b8301bebd6bb56ce9'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.137 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.138 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.138 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.138 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1495480396>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1495480396>]
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.138 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.138 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.138 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bd8ec4b2-8fa4-4b79-bccd-53688c6a5ba0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap63929a65-2f', 'timestamp': '2025-12-11T06:15:30.138603', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap63929a65-2f', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:f4:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63929a65-2f'}, 'message_id': 'cb195a04-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '3e0b571b3a8ea7e40925b77dcf4de76be287edff276589ca90b52493a2eca6e4'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap32cfa82e-ef', 'timestamp': '2025-12-11T06:15:30.138603', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap32cfa82e-ef', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:26:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap32cfa82e-ef'}, 'message_id': 'cb19662a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '307dcef76bb1006bc63b19ba74f4e9ec16fe6ba78bc210e3fb4e5ab31d808171'}]}, 'timestamp': '2025-12-11 06:15:30.139264', '_unique_id': '5c4d090d948249a992b47117677853cb'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.139 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.140 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.140 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.141 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1495480396>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1495480396>]
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.141 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.152 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.152 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7da6fb44-a3f0-4cd6-b84d-ee0ddbd36cfd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-vda', 'timestamp': '2025-12-11T06:15:30.141292', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb1b7974-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.843024883, 'message_signature': 'ebc47ad5548d6f28c1b3295324271ec723fcbdc07f62fa2ebc81d46da60067e9'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 
'4d2c2d90-6514-4e53-b77f-30e376bcb3ab-sda', 'timestamp': '2025-12-11T06:15:30.141292', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb1b869e-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.843024883, 'message_signature': '853302f99578ddb4bb8bbab797ce90d8e369121185a70ad6743eee0ed2ab49fd'}]}, 'timestamp': '2025-12-11 06:15:30.153169', '_unique_id': '7e0cd12ce0204136a23adf391b90155b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.154 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.155 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.155 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.155 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '28cceddc-6e15-4c64-8965-c0cb8ca056b2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap63929a65-2f', 'timestamp': '2025-12-11T06:15:30.155159', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap63929a65-2f', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:f4:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63929a65-2f'}, 'message_id': 'cb1be8b4-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '009ca0acd3cbe2e7e7245d561e7ab668beb8edef2b1b95cbdc0b8476a490877f'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 
0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap32cfa82e-ef', 'timestamp': '2025-12-11T06:15:30.155159', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap32cfa82e-ef', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:26:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap32cfa82e-ef'}, 'message_id': 'cb1bf4e4-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '403bb2fd6f4a3c44009a9df38d6fc7e8739b4914cc8ac7a47c772cdb19aa95e4'}]}, 'timestamp': '2025-12-11 06:15:30.156054', '_unique_id': 'c6dc5f4db1bf4c299804af505d9497cd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.156 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.157 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.157 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.158 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '497331ca-a969-495f-be68-b387785bcff6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-vda', 'timestamp': '2025-12-11T06:15:30.157700', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb1c43ae-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': '3700a423534e40c4216d46a7b3466da032077c4e8aeeed88035194ed204554a1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 
'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-sda', 'timestamp': '2025-12-11T06:15:30.157700', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb1c4f52-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': '3d05eebe40b541604a7016efd43e545ef6f613cff9add19b2776df4e0f2b6fce'}]}, 'timestamp': '2025-12-11 06:15:30.158340', '_unique_id': '4d8df89f641d441194e888f955444a18'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.159 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.160 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.160 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '36e6e3da-ee43-40e8-b793-0ca9247c550b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-vda', 'timestamp': '2025-12-11T06:15:30.159973', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb1c9ca0-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.843024883, 'message_signature': '1f7d3ca21f93745573a609eae2af9be651c8026b42b99e6eea5bc473eb3793f2'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 
'4d2c2d90-6514-4e53-b77f-30e376bcb3ab-sda', 'timestamp': '2025-12-11T06:15:30.159973', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb1ca92a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.843024883, 'message_signature': '48e20a9644ba1b5af1ee06229d45a01e2a3b1b78bc058f8966aadfa8bec279f5'}]}, 'timestamp': '2025-12-11 06:15:30.160604', '_unique_id': '6e414b98c16649f6a4a565d74e00e8b8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.161 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.162 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.162 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.162 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59db3ca5-788c-4d26-b5b3-1cb7ccb9f0b5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap63929a65-2f', 'timestamp': '2025-12-11T06:15:30.162142', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap63929a65-2f', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:f4:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63929a65-2f'}, 'message_id': 'cb1cf164-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '21fa8fa0d528600b031b972d29f491930a15a916a7b507f3285f344bdc8879f8'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap32cfa82e-ef', 'timestamp': '2025-12-11T06:15:30.162142', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap32cfa82e-ef', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:26:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap32cfa82e-ef'}, 'message_id': 'cb1cfe8e-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '2cd04d2096864e38c7e0030122c50bb051bec222ecc8c0ea0cd7ca3393fc6c40'}]}, 'timestamp': '2025-12-11 06:15:30.162801', '_unique_id': 'f1a3020584ca4ecd975c89ec542c90b6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.163 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.164 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.164 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.164 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9fdebad1-e22c-41e5-b412-1b58f9ca5d9b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-vda', 'timestamp': '2025-12-11T06:15:30.164641', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb1d538e-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': '934efedd54fed62c56bf4134d95cd262aa6f9f5247dd035dbee0d554363f043c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': 
None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-sda', 'timestamp': '2025-12-11T06:15:30.164641', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb1d5f14-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': 'c543e917991ffa44facb6b599f4556eb645494f1016004fc4a62a0ce744d01d2'}]}, 'timestamp': '2025-12-11 06:15:30.165259', '_unique_id': 'f9861b2d5a4b4d518dd26572e206e9f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.165 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.166 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.166 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.167 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0312fd9c-a35a-4e23-9efc-74cef41a7536', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap63929a65-2f', 'timestamp': '2025-12-11T06:15:30.166845', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap63929a65-2f', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:f4:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63929a65-2f'}, 'message_id': 'cb1da91a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': 'fd8f392c057fee50afc85a7beb91f6bc8d2f81dfa15a542e58dedd08cf8ac57e'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 
'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap32cfa82e-ef', 'timestamp': '2025-12-11T06:15:30.166845', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap32cfa82e-ef', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:26:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap32cfa82e-ef'}, 'message_id': 'cb1db694-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '059f5a6a4eacf0f107db7becb2074b1e68b38a9c122ded6122e3a06ab73a8fb1'}]}, 'timestamp': '2025-12-11 06:15:30.167535', '_unique_id': '30b0ae69af0347959b2f9ef8dc28ffff'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.168 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.169 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.169 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.169 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1495480396>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1495480396>]
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.169 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.170 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.170 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '435a0155-581e-4db7-954c-c187f6d8bc5e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-vda', 'timestamp': '2025-12-11T06:15:30.170072', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb1e2a52-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': '3302df3070d221791698de6f7c0514395f74e3528be683212370658e63e96b65'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 
'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-sda', 'timestamp': '2025-12-11T06:15:30.170072', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb1e377c-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': '4ac047db15ae3401aeccceff00032cdacd595f2f4c3101917a2c6be719983a7f'}]}, 'timestamp': '2025-12-11 06:15:30.170803', '_unique_id': 'd1368a9a6a8d4c7c84f47dd9c66aaa4c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.171 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.172 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.172 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.172 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8623f40b-ca44-4dc0-840a-095f18962ffe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap63929a65-2f', 'timestamp': '2025-12-11T06:15:30.172495', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap63929a65-2f', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:f4:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63929a65-2f'}, 'message_id': 'cb1e85b0-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '0a416cb7b6cd052b0eb9f2751d5512805e64de275b9215aec748c9dba7cb64cd'}, {'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': 
'60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap32cfa82e-ef', 'timestamp': '2025-12-11T06:15:30.172495', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap32cfa82e-ef', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:26:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap32cfa82e-ef'}, 'message_id': 'cb1e917c-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '2e41e4ccd498d1499791f47e9bbb7ffd6b6ddeac6b9f9a508526ce85cabdf5a2'}]}, 'timestamp': '2025-12-11 06:15:30.173115', '_unique_id': '01034546d1f94f7d84aeb5946675a87d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.173 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.174 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.192 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.192 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 4d2c2d90-6514-4e53-b77f-30e376bcb3ab: ceilometer.compute.pollsters.NoVolumeException
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.193 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.193 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a67fee94-45dc-4e13-bc87-51e64c52ca55', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap63929a65-2f', 'timestamp': '2025-12-11T06:15:30.193251', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap63929a65-2f', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:f4:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63929a65-2f'}, 'message_id': 'cb21b2da-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '231723d87c49457cb92a4604a57f3645d74ca421d904b64e41eb733d21b975c4'}, {'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap32cfa82e-ef', 'timestamp': '2025-12-11T06:15:30.193251', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap32cfa82e-ef', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:26:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap32cfa82e-ef'}, 'message_id': 'cb21bf0a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '88ee30c2dfda13123055aff2a6955dd047d13a49974bbea5afde0a6b3a857006'}]}, 'timestamp': '2025-12-11 06:15:30.193922', '_unique_id': 'b712271bc5364eaa9594a75b556a0d7d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.194 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.195 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.195 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.195 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'af2cdb20-cde3-4931-a464-4c9327627810', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-vda', 'timestamp': '2025-12-11T06:15:30.195492', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb22064a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.843024883, 'message_signature': 'e7ef2018795fe1309d33e8f73b5e34dd73f12d17bffbe666ba6687c43b17c54a'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-sda', 'timestamp': '2025-12-11T06:15:30.195492', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb220e6a-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.843024883, 'message_signature': 'c19ea4ede44a8d535424329c7de530014deabaa3785acd35bc9c66b60a495eac'}]}, 'timestamp': '2025-12-11 06:15:30.195922', '_unique_id': '5555542dc39f4f129613c73b395f8467'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.196 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.197 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.197 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.197 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '82948ec4-0a75-4b9a-affd-18b6d3111826', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap63929a65-2f', 'timestamp': '2025-12-11T06:15:30.197186', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap63929a65-2f', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:f4:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63929a65-2f'}, 'message_id': 'cb224966-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': 'd42185ce8395ea50885087bc31715a4cd9357cb2d54850bcd062a9fd130b0d50'}, {'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap32cfa82e-ef', 'timestamp': '2025-12-11T06:15:30.197186', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap32cfa82e-ef', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:26:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap32cfa82e-ef'}, 'message_id': 'cb225442-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '898252ba95f30983da1b2f2a17656eb6d4b2305d9b743fdef0ec5b8515fbe173'}]}, 'timestamp': '2025-12-11 06:15:30.197743', '_unique_id': '224a898ac48e451f90038e1eac34b1e5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.198 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.199 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.read.latency volume: 154130155 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.199 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.read.latency volume: 441403 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fad9cc72-3108-4d7f-bd01-b671688e0014', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 154130155, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-vda', 'timestamp': '2025-12-11T06:15:30.199207', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb229826-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': '04b561a5c50a647167a7c636b4d44d13edcabbbdd57a63251496d58637679ebb'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 441403, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 
'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-sda', 'timestamp': '2025-12-11T06:15:30.199207', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb22a30c-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': '4022bbba5947073214c411f067124c364541440ed67f722431d84fbc71fd1882'}]}, 'timestamp': '2025-12-11 06:15:30.199753', '_unique_id': 'a4f432c575a4444b9ae559ea4f08e41a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.200 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.201 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.201 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.201 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0af566ad-ab87-4427-ab03-61d61fdfba27', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap63929a65-2f', 'timestamp': '2025-12-11T06:15:30.201208', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap63929a65-2f', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:f4:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63929a65-2f'}, 'message_id': 'cb22e5a6-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': 'd157841cd923888e013774e1e5bdeaffe66537c4b10a051b97bc1ffd604c87fd'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': 
'60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap32cfa82e-ef', 'timestamp': '2025-12-11T06:15:30.201208', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap32cfa82e-ef', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:26:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap32cfa82e-ef'}, 'message_id': 'cb22f028-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '8df3af7c53d4e121e8d5016c33d6982ba0fb4cf5e1cee7fe92998a5e5ff035ce'}]}, 'timestamp': '2025-12-11 06:15:30.201720', '_unique_id': '988d910907c943599dede574e85fee29'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.202 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.203 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fd63885f-4acb-47c3-a5eb-18829d8370fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-vda', 'timestamp': '2025-12-11T06:15:30.202962', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'cb232a02-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': 'b2a403e0a80d09dc1ad7e49499782814257c1ee65919e644434cd1a3a0f13a56'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': 
None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab-sda', 'timestamp': '2025-12-11T06:15:30.202962', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'cb2334a2-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.808680157, 'message_signature': '13aede110a456b9181a98dbdeac7d857b770f8fe3dd7048b327225c674d7decd'}]}, 'timestamp': '2025-12-11 06:15:30.203488', '_unique_id': 'd0daac2992f54743ab6c577295a36092'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.204 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/cpu volume: 2570000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '73e32519-9b31-4021-8047-421d520608ba', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2570000000, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'timestamp': '2025-12-11T06:15:30.204825', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'instance-00000025', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'cb23737c-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.894184987, 'message_signature': 'ff120e968269c6b81bf7e6c259b61bb286fa99cbae65b1c22c2b25ddb5ab768b'}]}, 'timestamp': '2025-12-11 06:15:30.205101', '_unique_id': 'd6f62b368a7d45fab6b88b3a667a9b7a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.205 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.206 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.206 12 DEBUG ceilometer.compute.pollsters [-] 4d2c2d90-6514-4e53-b77f-30e376bcb3ab/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d290a78-8aa6-44f5-8b48-e7946b683608', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap63929a65-2f', 'timestamp': '2025-12-11T06:15:30.206255', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap63929a65-2f', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:1c:f4:66', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap63929a65-2f'}, 'message_id': 'cb23ab76-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': '1bc0202989db28fffe74ad245310575116e08513d8696fe5ffaca8383ff835ae'}, {'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000025-4d2c2d90-6514-4e53-b77f-30e376bcb3ab-tap32cfa82e-ef', 'timestamp': '2025-12-11T06:15:30.206255', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-1495480396', 'name': 'tap32cfa82e-ef', 'instance_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:79:26:55', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap32cfa82e-ef'}, 'message_id': 'cb23b616-d658-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4004.803145426, 'message_signature': 'b47932dd1bcc551c9886703c75243b31d703064386ceff2de10fbd6c3d0428cf'}]}, 'timestamp': '2025-12-11 06:15:30.206800', '_unique_id': '7c60f4fb3b774e3f9c43fc8f3a614187'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.208 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.208 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:15:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:15:30.208 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-1495480396>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-1495480396>]
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.215 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.217 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5498MB free_disk=73.29166412353516GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.217 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.217 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.348 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance 4d2c2d90-6514-4e53-b77f-30e376bcb3ab actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.349 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.349 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.406 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.424 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.438 187132 DEBUG nova.compute.manager [req-85c697dc-57af-4302-9f64-02a6fa6cee43 req-6070d0bb-081d-459c-96ab-13f49f8883ce eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-vif-deleted-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.485 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.485 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.268s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.597 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.773 187132 DEBUG nova.network.neutron [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updated VIF entry in instance network info cache for port 39751729-025a-4280-89aa-883712fc8dcb. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.773 187132 DEBUG nova.network.neutron [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Updating instance_info_cache with network_info: [{"id": "39751729-025a-4280-89aa-883712fc8dcb", "address": "fa:16:3e:34:76:eb", "network": {"id": "15a011b8-4d7f-4851-9aed-d01bb5a29d21", "bridge": "br-int", "label": "tempest-network-smoke--615454656", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.11", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "fce35ab888e44e46b3108813dcdf4163", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap39751729-02", "ovs_interfaceid": "39751729-025a-4280-89aa-883712fc8dcb", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.789 187132 DEBUG oslo_concurrency.lockutils [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-a64d006a-fa23-4538-a7c4-57160050b331" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.789 187132 DEBUG nova.compute.manager [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-vif-unplugged-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.790 187132 DEBUG oslo_concurrency.lockutils [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.790 187132 DEBUG oslo_concurrency.lockutils [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.790 187132 DEBUG oslo_concurrency.lockutils [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.790 187132 DEBUG nova.compute.manager [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] No waiting events found dispatching network-vif-unplugged-39751729-025a-4280-89aa-883712fc8dcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.791 187132 DEBUG nova.compute.manager [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-vif-unplugged-39751729-025a-4280-89aa-883712fc8dcb for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.791 187132 DEBUG nova.compute.manager [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.791 187132 DEBUG oslo_concurrency.lockutils [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "a64d006a-fa23-4538-a7c4-57160050b331-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.792 187132 DEBUG oslo_concurrency.lockutils [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.792 187132 DEBUG oslo_concurrency.lockutils [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "a64d006a-fa23-4538-a7c4-57160050b331-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.792 187132 DEBUG nova.compute.manager [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] No waiting events found dispatching network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:15:30 np0005554845 nova_compute[187128]: 2025-12-11 06:15:30.792 187132 WARNING nova.compute.manager [req-6ae8eada-d845-4e27-8a70-6f3d94badc66 req-c189b534-92bb-486d-87dd-ce877a560904 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Received unexpected event network-vif-plugged-39751729-025a-4280-89aa-883712fc8dcb for instance with vm_state active and task_state deleting.#033[00m
Dec 11 01:15:31 np0005554845 nova_compute[187128]: 2025-12-11 06:15:31.480 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:15:31 np0005554845 nova_compute[187128]: 2025-12-11 06:15:31.480 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:15:31 np0005554845 nova_compute[187128]: 2025-12-11 06:15:31.481 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:15:31 np0005554845 nova_compute[187128]: 2025-12-11 06:15:31.481 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:15:31 np0005554845 nova_compute[187128]: 2025-12-11 06:15:31.715 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:15:31 np0005554845 nova_compute[187128]: 2025-12-11 06:15:31.716 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:15:31 np0005554845 nova_compute[187128]: 2025-12-11 06:15:31.716 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 01:15:31 np0005554845 nova_compute[187128]: 2025-12-11 06:15:31.716 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid 4d2c2d90-6514-4e53-b77f-30e376bcb3ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:15:32 np0005554845 nova_compute[187128]: 2025-12-11 06:15:32.054 187132 DEBUG nova.compute.manager [req-1235ddcb-7a02-4267-9559-183662f113ac req-93fe4f30-5e28-496d-97e3-67c15c1e3b52 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-changed-63929a65-2f0b-481d-982f-4101b7879484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:32 np0005554845 nova_compute[187128]: 2025-12-11 06:15:32.055 187132 DEBUG nova.compute.manager [req-1235ddcb-7a02-4267-9559-183662f113ac req-93fe4f30-5e28-496d-97e3-67c15c1e3b52 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Refreshing instance network info cache due to event network-changed-63929a65-2f0b-481d-982f-4101b7879484. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:15:32 np0005554845 nova_compute[187128]: 2025-12-11 06:15:32.055 187132 DEBUG oslo_concurrency.lockutils [req-1235ddcb-7a02-4267-9559-183662f113ac req-93fe4f30-5e28-496d-97e3-67c15c1e3b52 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:15:32 np0005554845 podman[221771]: 2025-12-11 06:15:32.114599147 +0000 UTC m=+0.050089816 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:15:32 np0005554845 nova_compute[187128]: 2025-12-11 06:15:32.546 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:33 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:33Z|00240|binding|INFO|Releasing lport e6bf5322-feeb-40ef-a479-f4418af08bd9 from this chassis (sb_readonly=0)
Dec 11 01:15:33 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:33Z|00241|binding|INFO|Releasing lport f9fad358-fd40-4b95-a136-3d0381c2bd29 from this chassis (sb_readonly=0)
Dec 11 01:15:33 np0005554845 nova_compute[187128]: 2025-12-11 06:15:33.954 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.086 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updating instance_info_cache with network_info: [{"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", 
"mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.111 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.112 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.112 187132 DEBUG oslo_concurrency.lockutils [req-1235ddcb-7a02-4267-9559-183662f113ac req-93fe4f30-5e28-496d-97e3-67c15c1e3b52 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.113 187132 DEBUG nova.network.neutron [req-1235ddcb-7a02-4267-9559-183662f113ac req-93fe4f30-5e28-496d-97e3-67c15c1e3b52 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Refreshing network info cache for port 63929a65-2f0b-481d-982f-4101b7879484 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.115 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.116 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.116 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.116 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.117 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.601 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:35 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:35Z|00242|binding|INFO|Releasing lport e6bf5322-feeb-40ef-a479-f4418af08bd9 from this chassis (sb_readonly=0)
Dec 11 01:15:35 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:35Z|00243|binding|INFO|Releasing lport f9fad358-fd40-4b95-a136-3d0381c2bd29 from this chassis (sb_readonly=0)
Dec 11 01:15:35 np0005554845 nova_compute[187128]: 2025-12-11 06:15:35.743 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:36 np0005554845 podman[221791]: 2025-12-11 06:15:36.127482079 +0000 UTC m=+0.056299186 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:15:36 np0005554845 podman[221792]: 2025-12-11 06:15:36.131968641 +0000 UTC m=+0.055634017 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.buildah.version=1.33.7, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, architecture=x86_64)
Dec 11 01:15:36 np0005554845 nova_compute[187128]: 2025-12-11 06:15:36.764 187132 DEBUG nova.network.neutron [req-1235ddcb-7a02-4267-9559-183662f113ac req-93fe4f30-5e28-496d-97e3-67c15c1e3b52 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updated VIF entry in instance network info cache for port 63929a65-2f0b-481d-982f-4101b7879484. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:15:36 np0005554845 nova_compute[187128]: 2025-12-11 06:15:36.764 187132 DEBUG nova.network.neutron [req-1235ddcb-7a02-4267-9559-183662f113ac req-93fe4f30-5e28-496d-97e3-67c15c1e3b52 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updating instance_info_cache with network_info: [{"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:15:36 np0005554845 nova_compute[187128]: 2025-12-11 06:15:36.796 187132 DEBUG oslo_concurrency.lockutils [req-1235ddcb-7a02-4267-9559-183662f113ac req-93fe4f30-5e28-496d-97e3-67c15c1e3b52 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:15:37 np0005554845 nova_compute[187128]: 2025-12-11 06:15:37.609 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:37 np0005554845 nova_compute[187128]: 2025-12-11 06:15:37.695 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:15:37 np0005554845 nova_compute[187128]: 2025-12-11 06:15:37.696 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:15:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:38.066 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=14, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=13) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:15:38 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:38.067 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:15:38 np0005554845 nova_compute[187128]: 2025-12-11 06:15:38.069 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:39 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:39Z|00032|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:1c:f4:66 10.100.0.5
Dec 11 01:15:39 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:39Z|00033|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:1c:f4:66 10.100.0.5
Dec 11 01:15:40 np0005554845 nova_compute[187128]: 2025-12-11 06:15:40.603 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:42 np0005554845 nova_compute[187128]: 2025-12-11 06:15:42.403 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:42 np0005554845 nova_compute[187128]: 2025-12-11 06:15:42.541 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433727.506289, a64d006a-fa23-4538-a7c4-57160050b331 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:15:42 np0005554845 nova_compute[187128]: 2025-12-11 06:15:42.542 187132 INFO nova.compute.manager [-] [instance: a64d006a-fa23-4538-a7c4-57160050b331] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:15:42 np0005554845 nova_compute[187128]: 2025-12-11 06:15:42.560 187132 DEBUG nova.compute.manager [None req-8f10afd0-f768-4ec3-9c83-77735ffd1bc0 - - - - - -] [instance: a64d006a-fa23-4538-a7c4-57160050b331] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:15:42 np0005554845 nova_compute[187128]: 2025-12-11 06:15:42.611 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:43 np0005554845 nova_compute[187128]: 2025-12-11 06:15:43.070 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:45 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:45.070 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '14'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:45 np0005554845 nova_compute[187128]: 2025-12-11 06:15:45.605 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:47 np0005554845 nova_compute[187128]: 2025-12-11 06:15:47.613 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:49 np0005554845 nova_compute[187128]: 2025-12-11 06:15:49.666 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.024 187132 DEBUG oslo_concurrency.lockutils [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.025 187132 DEBUG oslo_concurrency.lockutils [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.025 187132 DEBUG oslo_concurrency.lockutils [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.026 187132 DEBUG oslo_concurrency.lockutils [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.026 187132 DEBUG oslo_concurrency.lockutils [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.027 187132 INFO nova.compute.manager [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Terminating instance#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.029 187132 DEBUG nova.compute.manager [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:15:50 np0005554845 kernel: tap63929a65-2f (unregistering): left promiscuous mode
Dec 11 01:15:50 np0005554845 NetworkManager[55529]: <info>  [1765433750.0627] device (tap63929a65-2f): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:15:50 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:50Z|00244|binding|INFO|Releasing lport 63929a65-2f0b-481d-982f-4101b7879484 from this chassis (sb_readonly=0)
Dec 11 01:15:50 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:50Z|00245|binding|INFO|Setting lport 63929a65-2f0b-481d-982f-4101b7879484 down in Southbound
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.071 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:50Z|00246|binding|INFO|Removing iface tap63929a65-2f ovn-installed in OVS
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.073 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.079 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:1c:f4:66 10.100.0.5'], port_security=['fa:16:3e:1c:f4:66 10.100.0.5'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.5/28', 'neutron:device_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70dc7c03-2005-47cf-a898-b31c3c862049', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '74c01a76-4421-4dd2-a8ba-7cd22c52b13e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0310888c-82ff-49ca-8771-e30be2399d81, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=63929a65-2f0b-481d-982f-4101b7879484) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.081 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 63929a65-2f0b-481d-982f-4101b7879484 in datapath 70dc7c03-2005-47cf-a898-b31c3c862049 unbound from our chassis#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.083 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 70dc7c03-2005-47cf-a898-b31c3c862049, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.084 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a0a4b152-1642-48e2-afa5-e0d356fda0a3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.084 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049 namespace which is not needed anymore#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.090 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 kernel: tap32cfa82e-ef (unregistering): left promiscuous mode
Dec 11 01:15:50 np0005554845 NetworkManager[55529]: <info>  [1765433750.0991] device (tap32cfa82e-ef): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.099 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:50Z|00247|binding|INFO|Releasing lport 32cfa82e-ef0b-43a6-9378-5eec85606901 from this chassis (sb_readonly=0)
Dec 11 01:15:50 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:50Z|00248|binding|INFO|Setting lport 32cfa82e-ef0b-43a6-9378-5eec85606901 down in Southbound
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.106 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 ovn_controller[95428]: 2025-12-11T06:15:50Z|00249|binding|INFO|Removing iface tap32cfa82e-ef ovn-installed in OVS
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.108 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.114 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:26:55 2001:db8::f816:3eff:fe79:2655'], port_security=['fa:16:3e:79:26:55 2001:db8::f816:3eff:fe79:2655'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '2001:db8::f816:3eff:fe79:2655/64', 'neutron:device_id': '4d2c2d90-6514-4e53-b77f-30e376bcb3ab', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5fd5f2b9-1570-4922-9d37-b3acee2aa306', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '74c01a76-4421-4dd2-a8ba-7cd22c52b13e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=c87d4ad4-67f8-4533-bf56-ecd743daead8, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=32cfa82e-ef0b-43a6-9378-5eec85606901) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.118 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.128 187132 DEBUG nova.compute.manager [req-ab543c45-f0d2-4d2a-85a0-37e039cc907f req-e827ddd2-80f5-441d-bdc2-ed6d73d7b20d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-changed-63929a65-2f0b-481d-982f-4101b7879484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.128 187132 DEBUG nova.compute.manager [req-ab543c45-f0d2-4d2a-85a0-37e039cc907f req-e827ddd2-80f5-441d-bdc2-ed6d73d7b20d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Refreshing instance network info cache due to event network-changed-63929a65-2f0b-481d-982f-4101b7879484. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.128 187132 DEBUG oslo_concurrency.lockutils [req-ab543c45-f0d2-4d2a-85a0-37e039cc907f req-e827ddd2-80f5-441d-bdc2-ed6d73d7b20d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.129 187132 DEBUG oslo_concurrency.lockutils [req-ab543c45-f0d2-4d2a-85a0-37e039cc907f req-e827ddd2-80f5-441d-bdc2-ed6d73d7b20d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.129 187132 DEBUG nova.network.neutron [req-ab543c45-f0d2-4d2a-85a0-37e039cc907f req-e827ddd2-80f5-441d-bdc2-ed6d73d7b20d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Refreshing network info cache for port 63929a65-2f0b-481d-982f-4101b7879484 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:15:50 np0005554845 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000025.scope: Deactivated successfully.
Dec 11 01:15:50 np0005554845 systemd[1]: machine-qemu\x2d17\x2dinstance\x2d00000025.scope: Consumed 13.730s CPU time.
Dec 11 01:15:50 np0005554845 systemd-machined[153381]: Machine qemu-17-instance-00000025 terminated.
Dec 11 01:15:50 np0005554845 NetworkManager[55529]: <info>  [1765433750.2609] manager: (tap32cfa82e-ef): new Tun device (/org/freedesktop/NetworkManager/Devices/125)
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.312 187132 INFO nova.virt.libvirt.driver [-] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Instance destroyed successfully.#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.314 187132 DEBUG nova.objects.instance [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'resources' on Instance uuid 4d2c2d90-6514-4e53-b77f-30e376bcb3ab obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:15:50 np0005554845 neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049[221531]: [NOTICE]   (221551) : haproxy version is 2.8.14-c23fe91
Dec 11 01:15:50 np0005554845 neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049[221531]: [NOTICE]   (221551) : path to executable is /usr/sbin/haproxy
Dec 11 01:15:50 np0005554845 neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049[221531]: [WARNING]  (221551) : Exiting Master process...
Dec 11 01:15:50 np0005554845 neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049[221531]: [WARNING]  (221551) : Exiting Master process...
Dec 11 01:15:50 np0005554845 neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049[221531]: [ALERT]    (221551) : Current worker (221556) exited with code 143 (Terminated)
Dec 11 01:15:50 np0005554845 neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049[221531]: [WARNING]  (221551) : All workers exited. Exiting... (0)
Dec 11 01:15:50 np0005554845 systemd[1]: libpod-b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7.scope: Deactivated successfully.
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.326 187132 DEBUG nova.virt.libvirt.vif [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:15:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1495480396',display_name='tempest-TestGettingAddress-server-1495480396',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1495480396',id=37,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPxE6a7qoajtcVEApESceqdLqWS0grIVgvnvU7MQ5/B+0xinwqYVq27IHbe1pAlSX1R75zTl3qrHhAuAFc+Wdv5POSdffVcY3xpsVHBpr0U5d8WbecqL00KPsjsUVxxADA==',key_name='tempest-TestGettingAddress-1338945651',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:15:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-et3g81w5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:15:27Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=4d2c2d90-6514-4e53-b77f-30e376bcb3ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.327 187132 DEBUG nova.network.os_vif_util [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.327 187132 DEBUG nova.network.os_vif_util [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:1c:f4:66,bridge_name='br-int',has_traffic_filtering=True,id=63929a65-2f0b-481d-982f-4101b7879484,network=Network(70dc7c03-2005-47cf-a898-b31c3c862049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63929a65-2f') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.328 187132 DEBUG os_vif [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:f4:66,bridge_name='br-int',has_traffic_filtering=True,id=63929a65-2f0b-481d-982f-4101b7879484,network=Network(70dc7c03-2005-47cf-a898-b31c3c862049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63929a65-2f') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:15:50 np0005554845 podman[221876]: 2025-12-11 06:15:50.328608016 +0000 UTC m=+0.162434431 container died b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.329 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.330 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap63929a65-2f, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.373 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.375 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.377 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.379 187132 INFO os_vif [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:1c:f4:66,bridge_name='br-int',has_traffic_filtering=True,id=63929a65-2f0b-481d-982f-4101b7879484,network=Network(70dc7c03-2005-47cf-a898-b31c3c862049),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap63929a65-2f')#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.381 187132 DEBUG nova.virt.libvirt.vif [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:15:08Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-1495480396',display_name='tempest-TestGettingAddress-server-1495480396',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-1495480396',id=37,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPxE6a7qoajtcVEApESceqdLqWS0grIVgvnvU7MQ5/B+0xinwqYVq27IHbe1pAlSX1R75zTl3qrHhAuAFc+Wdv5POSdffVcY3xpsVHBpr0U5d8WbecqL00KPsjsUVxxADA==',key_name='tempest-TestGettingAddress-1338945651',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:15:27Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-et3g81w5',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:15:27Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=4d2c2d90-6514-4e53-b77f-30e376bcb3ab,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.381 187132 DEBUG nova.network.os_vif_util [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.382 187132 DEBUG nova.network.os_vif_util [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:79:26:55,bridge_name='br-int',has_traffic_filtering=True,id=32cfa82e-ef0b-43a6-9378-5eec85606901,network=Network(5fd5f2b9-1570-4922-9d37-b3acee2aa306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32cfa82e-ef') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.382 187132 DEBUG os_vif [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:79:26:55,bridge_name='br-int',has_traffic_filtering=True,id=32cfa82e-ef0b-43a6-9378-5eec85606901,network=Network(5fd5f2b9-1570-4922-9d37-b3acee2aa306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32cfa82e-ef') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.385 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.386 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap32cfa82e-ef, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.387 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.389 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.391 187132 INFO os_vif [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:79:26:55,bridge_name='br-int',has_traffic_filtering=True,id=32cfa82e-ef0b-43a6-9378-5eec85606901,network=Network(5fd5f2b9-1570-4922-9d37-b3acee2aa306),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap32cfa82e-ef')#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.391 187132 INFO nova.virt.libvirt.driver [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Deleting instance files /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab_del#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.392 187132 INFO nova.virt.libvirt.driver [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Deletion of /var/lib/nova/instances/4d2c2d90-6514-4e53-b77f-30e376bcb3ab_del complete#033[00m
Dec 11 01:15:50 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7-userdata-shm.mount: Deactivated successfully.
Dec 11 01:15:50 np0005554845 systemd[1]: var-lib-containers-storage-overlay-7bd4cf718ec006296757ce05315646fa2a1bf95b2c99308ce08ab43eeae15849-merged.mount: Deactivated successfully.
Dec 11 01:15:50 np0005554845 podman[221876]: 2025-12-11 06:15:50.400004985 +0000 UTC m=+0.233831410 container cleanup b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 01:15:50 np0005554845 systemd[1]: libpod-conmon-b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7.scope: Deactivated successfully.
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.443 187132 INFO nova.compute.manager [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Took 0.41 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.444 187132 DEBUG oslo.service.loopingcall [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.444 187132 DEBUG nova.compute.manager [-] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.445 187132 DEBUG nova.network.neutron [-] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:15:50 np0005554845 podman[221935]: 2025-12-11 06:15:50.463966482 +0000 UTC m=+0.043776999 container remove b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_managed=true)
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.468 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[20d8fca8-ff96-4946-8d86-23f098cac726]: (4, ('Thu Dec 11 06:15:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049 (b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7)\nb10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7\nThu Dec 11 06:15:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049 (b10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7)\nb10c2a9c5ed35a488cbcecd3d967243fa1291fcca47ad3dedd96d1999e3cdbb7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.470 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[67090084-680a-4fa0-bd51-340bd740ac58]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.471 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70dc7c03-20, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.472 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 kernel: tap70dc7c03-20: left promiscuous mode
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.483 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.485 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[248e477a-2392-4c71-9295-4caa69077412]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.498 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[579e4e0b-4f09-4675-a9a5-9d773910d513]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.499 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6bd3e78e-0a12-4781-aa89-47ed4569cf0f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.512 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2159f7b5-c488-4356-81c5-24d5b548bc4d]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 400022, 'reachable_time': 35173, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 221950, 'error': None, 'target': 'ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.514 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-70dc7c03-2005-47cf-a898-b31c3c862049 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.514 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[930a5eea-9d5e-4013-a0b4-41532175a54f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.515 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 32cfa82e-ef0b-43a6-9378-5eec85606901 in datapath 5fd5f2b9-1570-4922-9d37-b3acee2aa306 unbound from our chassis#033[00m
Dec 11 01:15:50 np0005554845 systemd[1]: run-netns-ovnmeta\x2d70dc7c03\x2d2005\x2d47cf\x2da898\x2db31c3c862049.mount: Deactivated successfully.
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.516 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5fd5f2b9-1570-4922-9d37-b3acee2aa306, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.517 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[de03b658-8ccc-413d-be07-eea6a778ed6e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.517 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306 namespace which is not needed anymore#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.606 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306[221621]: [NOTICE]   (221625) : haproxy version is 2.8.14-c23fe91
Dec 11 01:15:50 np0005554845 neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306[221621]: [NOTICE]   (221625) : path to executable is /usr/sbin/haproxy
Dec 11 01:15:50 np0005554845 neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306[221621]: [WARNING]  (221625) : Exiting Master process...
Dec 11 01:15:50 np0005554845 neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306[221621]: [ALERT]    (221625) : Current worker (221627) exited with code 143 (Terminated)
Dec 11 01:15:50 np0005554845 neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306[221621]: [WARNING]  (221625) : All workers exited. Exiting... (0)
Dec 11 01:15:50 np0005554845 systemd[1]: libpod-5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a.scope: Deactivated successfully.
Dec 11 01:15:50 np0005554845 podman[221968]: 2025-12-11 06:15:50.648667538 +0000 UTC m=+0.039460403 container died 5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:15:50 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a-userdata-shm.mount: Deactivated successfully.
Dec 11 01:15:50 np0005554845 systemd[1]: var-lib-containers-storage-overlay-3da898272f46ab7b3f3aadf00b44d6d92647b9d8e6d81fe1c77f543d417c805e-merged.mount: Deactivated successfully.
Dec 11 01:15:50 np0005554845 podman[221968]: 2025-12-11 06:15:50.689018143 +0000 UTC m=+0.079810978 container cleanup 5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 01:15:50 np0005554845 systemd[1]: libpod-conmon-5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a.scope: Deactivated successfully.
Dec 11 01:15:50 np0005554845 podman[221997]: 2025-12-11 06:15:50.752040615 +0000 UTC m=+0.044623673 container remove 5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202)
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.757 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[10316d10-abe0-48da-98ab-4edc31f87c1a]: (4, ('Thu Dec 11 06:15:50 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306 (5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a)\n5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a\nThu Dec 11 06:15:50 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306 (5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a)\n5379846bef49b50eacce5175a299ffbadc903a124390747e2a5b5ee201f2463a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.758 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fca0bffd-a6aa-41e4-9445-e4f2973b498f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.760 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5fd5f2b9-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.761 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 kernel: tap5fd5f2b9-10: left promiscuous mode
Dec 11 01:15:50 np0005554845 nova_compute[187128]: 2025-12-11 06:15:50.773 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.775 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d41a618f-a935-4444-aa3f-f7a6f1c28c98]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.800 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c9d7f188-51e7-4205-896c-cbe2bea0c1a0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.802 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[df3ce0c8-4410-44c5-939e-f2223e7abc1f]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.822 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7d448d80-83e0-43bf-a374-2cda35267a58]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 400109, 'reachable_time': 34332, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222014, 'error': None, 'target': 'ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.824 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5fd5f2b9-1570-4922-9d37-b3acee2aa306 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:15:50 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:15:50.825 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[ac6f843c-639c-4343-9add-47ccab285975]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:15:51 np0005554845 podman[222015]: 2025-12-11 06:15:51.374912897 +0000 UTC m=+0.053288818 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:15:51 np0005554845 systemd[1]: run-netns-ovnmeta\x2d5fd5f2b9\x2d1570\x2d4922\x2d9d37\x2db3acee2aa306.mount: Deactivated successfully.
Dec 11 01:15:51 np0005554845 nova_compute[187128]: 2025-12-11 06:15:51.657 187132 DEBUG nova.compute.manager [req-3d3dda65-7799-4217-81a4-d12422c09a98 req-bfe8fae0-932e-47b4-8f70-d69394202377 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-vif-deleted-32cfa82e-ef0b-43a6-9378-5eec85606901 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:51 np0005554845 nova_compute[187128]: 2025-12-11 06:15:51.658 187132 INFO nova.compute.manager [req-3d3dda65-7799-4217-81a4-d12422c09a98 req-bfe8fae0-932e-47b4-8f70-d69394202377 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Neutron deleted interface 32cfa82e-ef0b-43a6-9378-5eec85606901; detaching it from the instance and deleting it from the info cache#033[00m
Dec 11 01:15:51 np0005554845 nova_compute[187128]: 2025-12-11 06:15:51.659 187132 DEBUG nova.network.neutron [req-3d3dda65-7799-4217-81a4-d12422c09a98 req-bfe8fae0-932e-47b4-8f70-d69394202377 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updating instance_info_cache with network_info: [{"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.232", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:15:51 np0005554845 nova_compute[187128]: 2025-12-11 06:15:51.684 187132 DEBUG nova.compute.manager [req-3d3dda65-7799-4217-81a4-d12422c09a98 req-bfe8fae0-932e-47b4-8f70-d69394202377 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Detach interface failed, port_id=32cfa82e-ef0b-43a6-9378-5eec85606901, reason: Instance 4d2c2d90-6514-4e53-b77f-30e376bcb3ab could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.003 187132 DEBUG nova.network.neutron [req-ab543c45-f0d2-4d2a-85a0-37e039cc907f req-e827ddd2-80f5-441d-bdc2-ed6d73d7b20d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updated VIF entry in instance network info cache for port 63929a65-2f0b-481d-982f-4101b7879484. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.005 187132 DEBUG nova.network.neutron [req-ab543c45-f0d2-4d2a-85a0-37e039cc907f req-e827ddd2-80f5-441d-bdc2-ed6d73d7b20d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updating instance_info_cache with network_info: [{"id": "63929a65-2f0b-481d-982f-4101b7879484", "address": "fa:16:3e:1c:f4:66", "network": {"id": "70dc7c03-2005-47cf-a898-b31c3c862049", "bridge": "br-int", "label": "tempest-network-smoke--1367324066", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.5", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap63929a65-2f", "ovs_interfaceid": "63929a65-2f0b-481d-982f-4101b7879484", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}, {"id": "32cfa82e-ef0b-43a6-9378-5eec85606901", "address": "fa:16:3e:79:26:55", "network": {"id": "5fd5f2b9-1570-4922-9d37-b3acee2aa306", "bridge": "br-int", "label": "tempest-network-smoke--1735006857", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe79:2655", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": 
"79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap32cfa82e-ef", "ovs_interfaceid": "32cfa82e-ef0b-43a6-9378-5eec85606901", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.054 187132 DEBUG oslo_concurrency.lockutils [req-ab543c45-f0d2-4d2a-85a0-37e039cc907f req-e827ddd2-80f5-441d-bdc2-ed6d73d7b20d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-4d2c2d90-6514-4e53-b77f-30e376bcb3ab" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.093 187132 DEBUG nova.network.neutron [-] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.109 187132 INFO nova.compute.manager [-] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Took 1.66 seconds to deallocate network for instance.#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.158 187132 DEBUG oslo_concurrency.lockutils [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.158 187132 DEBUG oslo_concurrency.lockutils [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.210 187132 DEBUG nova.compute.provider_tree [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.222 187132 DEBUG nova.scheduler.client.report [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.246 187132 DEBUG oslo_concurrency.lockutils [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.088s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.276 187132 INFO nova.scheduler.client.report [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Deleted allocations for instance 4d2c2d90-6514-4e53-b77f-30e376bcb3ab#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.364 187132 DEBUG oslo_concurrency.lockutils [None req-0ddd808b-f868-43cb-9892-efe2390e385a 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.339s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.383 187132 DEBUG nova.compute.manager [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-vif-unplugged-63929a65-2f0b-481d-982f-4101b7879484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.384 187132 DEBUG oslo_concurrency.lockutils [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.384 187132 DEBUG oslo_concurrency.lockutils [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.384 187132 DEBUG oslo_concurrency.lockutils [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.385 187132 DEBUG nova.compute.manager [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] No waiting events found dispatching network-vif-unplugged-63929a65-2f0b-481d-982f-4101b7879484 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.385 187132 WARNING nova.compute.manager [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received unexpected event network-vif-unplugged-63929a65-2f0b-481d-982f-4101b7879484 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.385 187132 DEBUG nova.compute.manager [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-vif-plugged-63929a65-2f0b-481d-982f-4101b7879484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.386 187132 DEBUG oslo_concurrency.lockutils [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.386 187132 DEBUG oslo_concurrency.lockutils [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.386 187132 DEBUG oslo_concurrency.lockutils [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "4d2c2d90-6514-4e53-b77f-30e376bcb3ab-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.386 187132 DEBUG nova.compute.manager [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] No waiting events found dispatching network-vif-plugged-63929a65-2f0b-481d-982f-4101b7879484 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:15:52 np0005554845 nova_compute[187128]: 2025-12-11 06:15:52.387 187132 WARNING nova.compute.manager [req-01f6d49d-47d0-4ed0-bc21-1674cdc8fba1 req-335f295e-3453-4a64-a181-d7ed5656f2ed eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received unexpected event network-vif-plugged-63929a65-2f0b-481d-982f-4101b7879484 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.609 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.610 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.626 187132 DEBUG nova.compute.manager [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.713 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.713 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.726 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.726 187132 INFO nova.compute.claims [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.765 187132 DEBUG nova.compute.manager [req-162a6a71-993f-4ebb-a26c-07785ce45e0f req-8a566447-6b2a-482a-99f5-66f361a5a96b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Received event network-vif-deleted-63929a65-2f0b-481d-982f-4101b7879484 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.904 187132 DEBUG nova.compute.provider_tree [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.945 187132 DEBUG nova.scheduler.client.report [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.977 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.263s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:53 np0005554845 nova_compute[187128]: 2025-12-11 06:15:53.978 187132 DEBUG nova.compute.manager [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.038 187132 DEBUG nova.compute.manager [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.038 187132 DEBUG nova.network.neutron [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.059 187132 INFO nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.080 187132 DEBUG nova.compute.manager [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.200 187132 DEBUG nova.policy [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '40cb523bfe1e4484bb2e91c903500c97', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'project_domain_id': 'default', 'roles': ['member', 'reader'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.649 187132 DEBUG nova.compute.manager [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.650 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.650 187132 INFO nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Creating image(s)#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.651 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "/var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.651 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.652 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "/var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.670 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.741 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.071s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.743 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.745 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.774 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.829 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.831 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.871 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk 1073741824" returned: 0 in 0.040s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.872 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.127s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.872 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.935 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.939 187132 DEBUG nova.virt.disk.api [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Checking if we can resize image /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:15:54 np0005554845 nova_compute[187128]: 2025-12-11 06:15:54.940 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:15:55 np0005554845 nova_compute[187128]: 2025-12-11 06:15:55.014 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk --force-share --output=json" returned: 0 in 0.074s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:15:55 np0005554845 nova_compute[187128]: 2025-12-11 06:15:55.015 187132 DEBUG nova.virt.disk.api [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Cannot resize image /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:15:55 np0005554845 nova_compute[187128]: 2025-12-11 06:15:55.016 187132 DEBUG nova.objects.instance [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'migration_context' on Instance uuid e8fcccda-d2bc-4e5a-b478-e526a1d2662c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:15:55 np0005554845 nova_compute[187128]: 2025-12-11 06:15:55.047 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:15:55 np0005554845 nova_compute[187128]: 2025-12-11 06:15:55.048 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Ensure instance console log exists: /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:15:55 np0005554845 nova_compute[187128]: 2025-12-11 06:15:55.049 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:15:55 np0005554845 nova_compute[187128]: 2025-12-11 06:15:55.049 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:15:55 np0005554845 nova_compute[187128]: 2025-12-11 06:15:55.050 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:15:55 np0005554845 nova_compute[187128]: 2025-12-11 06:15:55.389 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:55 np0005554845 nova_compute[187128]: 2025-12-11 06:15:55.608 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:15:56 np0005554845 nova_compute[187128]: 2025-12-11 06:15:56.998 187132 DEBUG nova.network.neutron [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Successfully created port: 50100b7e-dec5-41e3-a26a-a1b50420a48f _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:15:57 np0005554845 podman[222055]: 2025-12-11 06:15:57.16239603 +0000 UTC m=+0.078594385 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm)
Dec 11 01:15:59 np0005554845 nova_compute[187128]: 2025-12-11 06:15:59.090 187132 DEBUG nova.network.neutron [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Successfully updated port: 50100b7e-dec5-41e3-a26a-a1b50420a48f _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:15:59 np0005554845 nova_compute[187128]: 2025-12-11 06:15:59.103 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:15:59 np0005554845 nova_compute[187128]: 2025-12-11 06:15:59.103 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquired lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:15:59 np0005554845 nova_compute[187128]: 2025-12-11 06:15:59.104 187132 DEBUG nova.network.neutron [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:15:59 np0005554845 nova_compute[187128]: 2025-12-11 06:15:59.258 187132 DEBUG nova.network.neutron [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:15:59 np0005554845 nova_compute[187128]: 2025-12-11 06:15:59.512 187132 DEBUG nova.compute.manager [req-30e6dc2e-3bb5-44fe-abbe-3cfe1a588a46 req-30fb29af-2fdf-477b-872e-a5f961523f96 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-changed-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:15:59 np0005554845 nova_compute[187128]: 2025-12-11 06:15:59.513 187132 DEBUG nova.compute.manager [req-30e6dc2e-3bb5-44fe-abbe-3cfe1a588a46 req-30fb29af-2fdf-477b-872e-a5f961523f96 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Refreshing instance network info cache due to event network-changed-50100b7e-dec5-41e3-a26a-a1b50420a48f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:15:59 np0005554845 nova_compute[187128]: 2025-12-11 06:15:59.513 187132 DEBUG oslo_concurrency.lockutils [req-30e6dc2e-3bb5-44fe-abbe-3cfe1a588a46 req-30fb29af-2fdf-477b-872e-a5f961523f96 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:16:00 np0005554845 podman[222077]: 2025-12-11 06:16:00.115164969 +0000 UTC m=+0.052802275 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.build-date=20251202, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:16:00 np0005554845 podman[222078]: 2025-12-11 06:16:00.155574237 +0000 UTC m=+0.085114683 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 11 01:16:00 np0005554845 nova_compute[187128]: 2025-12-11 06:16:00.391 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:00 np0005554845 nova_compute[187128]: 2025-12-11 06:16:00.611 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.630 187132 DEBUG nova.network.neutron [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Updating instance_info_cache with network_info: [{"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.656 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Releasing lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.656 187132 DEBUG nova.compute.manager [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Instance network_info: |[{"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.657 187132 DEBUG oslo_concurrency.lockutils [req-30e6dc2e-3bb5-44fe-abbe-3cfe1a588a46 req-30fb29af-2fdf-477b-872e-a5f961523f96 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.658 187132 DEBUG nova.network.neutron [req-30e6dc2e-3bb5-44fe-abbe-3cfe1a588a46 req-30fb29af-2fdf-477b-872e-a5f961523f96 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Refreshing network info cache for port 50100b7e-dec5-41e3-a26a-a1b50420a48f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.664 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Start _get_guest_xml network_info=[{"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.671 187132 WARNING nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.676 187132 DEBUG nova.virt.libvirt.host [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.677 187132 DEBUG nova.virt.libvirt.host [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.686 187132 DEBUG nova.virt.libvirt.host [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.687 187132 DEBUG nova.virt.libvirt.host [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.689 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.689 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.690 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.690 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.691 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.691 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.691 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.692 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.693 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.693 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.693 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.694 187132 DEBUG nova.virt.hardware [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.699 187132 DEBUG nova.virt.libvirt.vif [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-867218591',display_name='tempest-TestNetworkAdvancedServerOps-server-867218591',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-867218591',id=39,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL0AMCjhpLgAYbyW4RbNIsnfsfUIy5k9j3OYnQ76lUxwzuqHt5hm5bLSBBEruJUcifDTHX+MUrSG1utS0WCewPMauEm/KwEP/RhnwGQJFj9Xrh5egfQsGN9ZRRoEZNVLpQ==',key_name='tempest-TestNetworkAdvancedServerOps-1189574564',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-m03msnke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:15:54Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=e8fcccda-d2bc-4e5a-b478-e526a1d2662c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.700 187132 DEBUG nova.network.os_vif_util [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.700 187132 DEBUG nova.network.os_vif_util [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:2b:0e,bridge_name='br-int',has_traffic_filtering=True,id=50100b7e-dec5-41e3-a26a-a1b50420a48f,network=Network(5f7d50c2-325f-481b-ab67-1c19b7285e1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50100b7e-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.701 187132 DEBUG nova.objects.instance [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'pci_devices' on Instance uuid e8fcccda-d2bc-4e5a-b478-e526a1d2662c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.731 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  <uuid>e8fcccda-d2bc-4e5a-b478-e526a1d2662c</uuid>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  <name>instance-00000027</name>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestNetworkAdvancedServerOps-server-867218591</nova:name>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:16:01</nova:creationTime>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:        <nova:user uuid="40cb523bfe1e4484bb2e91c903500c97">tempest-TestNetworkAdvancedServerOps-369129245-project-member</nova:user>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:        <nova:project uuid="3ec4c03cd7274517b88d9087ad4cbd83">tempest-TestNetworkAdvancedServerOps-369129245</nova:project>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:        <nova:port uuid="50100b7e-dec5-41e3-a26a-a1b50420a48f">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.14" ipVersion="4"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <entry name="serial">e8fcccda-d2bc-4e5a-b478-e526a1d2662c</entry>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <entry name="uuid">e8fcccda-d2bc-4e5a-b478-e526a1d2662c</entry>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk.config"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:c6:2b:0e"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <target dev="tap50100b7e-de"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/console.log" append="off"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:16:01 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:16:01 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:16:01 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:16:01 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.733 187132 DEBUG nova.compute.manager [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Preparing to wait for external event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.733 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.733 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.733 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.734 187132 DEBUG nova.virt.libvirt.vif [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-867218591',display_name='tempest-TestNetworkAdvancedServerOps-server-867218591',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-867218591',id=39,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL0AMCjhpLgAYbyW4RbNIsnfsfUIy5k9j3OYnQ76lUxwzuqHt5hm5bLSBBEruJUcifDTHX+MUrSG1utS0WCewPMauEm/KwEP/RhnwGQJFj9Xrh5egfQsGN9ZRRoEZNVLpQ==',key_name='tempest-TestNetworkAdvancedServerOps-1189574564',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-m03msnke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:15:54Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=e8fcccda-d2bc-4e5a-b478-e526a1d2662c,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.734 187132 DEBUG nova.network.os_vif_util [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.735 187132 DEBUG nova.network.os_vif_util [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:2b:0e,bridge_name='br-int',has_traffic_filtering=True,id=50100b7e-dec5-41e3-a26a-a1b50420a48f,network=Network(5f7d50c2-325f-481b-ab67-1c19b7285e1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50100b7e-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.735 187132 DEBUG os_vif [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:2b:0e,bridge_name='br-int',has_traffic_filtering=True,id=50100b7e-dec5-41e3-a26a-a1b50420a48f,network=Network(5f7d50c2-325f-481b-ab67-1c19b7285e1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50100b7e-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.735 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.736 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.736 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.738 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.738 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50100b7e-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.738 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap50100b7e-de, col_values=(('external_ids', {'iface-id': '50100b7e-dec5-41e3-a26a-a1b50420a48f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:2b:0e', 'vm-uuid': 'e8fcccda-d2bc-4e5a-b478-e526a1d2662c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.740 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:01 np0005554845 NetworkManager[55529]: <info>  [1765433761.7412] manager: (tap50100b7e-de): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/126)
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.742 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.746 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.747 187132 INFO os_vif [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:2b:0e,bridge_name='br-int',has_traffic_filtering=True,id=50100b7e-dec5-41e3-a26a-a1b50420a48f,network=Network(5f7d50c2-325f-481b-ab67-1c19b7285e1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50100b7e-de')#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.824 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.825 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.825 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] No VIF found with MAC fa:16:3e:c6:2b:0e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:16:01 np0005554845 nova_compute[187128]: 2025-12-11 06:16:01.826 187132 INFO nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Using config drive#033[00m
Dec 11 01:16:02 np0005554845 nova_compute[187128]: 2025-12-11 06:16:02.761 187132 INFO nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Creating config drive at /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk.config#033[00m
Dec 11 01:16:02 np0005554845 nova_compute[187128]: 2025-12-11 06:16:02.765 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1_epyjdk execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:16:02 np0005554845 nova_compute[187128]: 2025-12-11 06:16:02.898 187132 DEBUG oslo_concurrency.processutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp1_epyjdk" returned: 0 in 0.133s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:16:02 np0005554845 kernel: tap50100b7e-de: entered promiscuous mode
Dec 11 01:16:02 np0005554845 nova_compute[187128]: 2025-12-11 06:16:02.966 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:02 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:02Z|00250|binding|INFO|Claiming lport 50100b7e-dec5-41e3-a26a-a1b50420a48f for this chassis.
Dec 11 01:16:02 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:02Z|00251|binding|INFO|50100b7e-dec5-41e3-a26a-a1b50420a48f: Claiming fa:16:3e:c6:2b:0e 10.100.0.14
Dec 11 01:16:02 np0005554845 NetworkManager[55529]: <info>  [1765433762.9680] manager: (tap50100b7e-de): new Tun device (/org/freedesktop/NetworkManager/Devices/127)
Dec 11 01:16:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:02.977 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:2b:0e 10.100.0.14'], port_security=['fa:16:3e:c6:2b:0e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e8fcccda-d2bc-4e5a-b478-e526a1d2662c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '2', 'neutron:security_group_ids': '78eb39e7-31de-4384-bd28-5cab9069ce58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfe2697e-ebea-497a-b5ca-c1d428531e27, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=50100b7e-dec5-41e3-a26a-a1b50420a48f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:16:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:02.978 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 50100b7e-dec5-41e3-a26a-a1b50420a48f in datapath 5f7d50c2-325f-481b-ab67-1c19b7285e1a bound to our chassis#033[00m
Dec 11 01:16:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:02.980 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5f7d50c2-325f-481b-ab67-1c19b7285e1a#033[00m
Dec 11 01:16:02 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:02Z|00252|binding|INFO|Setting lport 50100b7e-dec5-41e3-a26a-a1b50420a48f ovn-installed in OVS
Dec 11 01:16:02 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:02Z|00253|binding|INFO|Setting lport 50100b7e-dec5-41e3-a26a-a1b50420a48f up in Southbound
Dec 11 01:16:02 np0005554845 nova_compute[187128]: 2025-12-11 06:16:02.984 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:02 np0005554845 nova_compute[187128]: 2025-12-11 06:16:02.987 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:02.995 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5bd56736-84ab-4e8c-9f16-b9608d3f4f0d]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:02.995 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5f7d50c2-31 in ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:16:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:02.998 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5f7d50c2-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:16:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:02.998 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ad573b00-63b6-468c-8d4c-29f2fcb80b99]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:02 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:02.999 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[203eb967-48aa-4627-bfee-7bd1a8c35420]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 systemd-udevd[222153]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:16:03 np0005554845 systemd-machined[153381]: New machine qemu-18-instance-00000027.
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.014 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[8e1a4fdf-b21d-4e95-942c-0f852e000446]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 NetworkManager[55529]: <info>  [1765433763.0170] device (tap50100b7e-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:16:03 np0005554845 NetworkManager[55529]: <info>  [1765433763.0177] device (tap50100b7e-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:16:03 np0005554845 systemd[1]: Started Virtual Machine qemu-18-instance-00000027.
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.036 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[547cb2e0-2361-4dc2-b720-28b24a8f124a]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 podman[222135]: 2025-12-11 06:16:03.038445188 +0000 UTC m=+0.073814016 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.064 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[afa0e972-9ece-4a40-8dd2-d580a9d605ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.070 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[e30e1f64-3e2d-46a3-828c-c4f2b364a003]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 NetworkManager[55529]: <info>  [1765433763.0719] manager: (tap5f7d50c2-30): new Veth device (/org/freedesktop/NetworkManager/Devices/128)
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.105 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[1f4fe7b2-7163-4487-be37-d7abaf2782c5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.109 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[db6876e0-c5e7-46cc-b513-1f4b71fa65cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 NetworkManager[55529]: <info>  [1765433763.1341] device (tap5f7d50c2-30): carrier: link connected
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.144 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[5c8b6c60-2a6f-4181-849c-f2c58225905f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.160 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[caa0fa1e-fbd8-48c8-a8bd-99b6f767551d]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5f7d50c2-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:71:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403778, 'reachable_time': 35460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222194, 'error': None, 'target': 'ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.176 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[26b4971b-55c0-4e74-8efe-98fdadcff9cd]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:71ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 403778, 'tstamp': 403778}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222195, 'error': None, 'target': 'ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.193 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[bd4b4c03-d500-49f9-ac49-087c74108476]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5f7d50c2-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:71:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 76], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403778, 'reachable_time': 35460, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222196, 'error': None, 'target': 'ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.223 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[08128ea1-b7bb-4d59-840c-2c8f89920b41]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.276 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[bc3ac0bd-4ed4-47e1-b55a-102c449c277c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.281 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f7d50c2-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.281 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.282 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f7d50c2-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:03 np0005554845 kernel: tap5f7d50c2-30: entered promiscuous mode
Dec 11 01:16:03 np0005554845 NetworkManager[55529]: <info>  [1765433763.2846] manager: (tap5f7d50c2-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/129)
Dec 11 01:16:03 np0005554845 nova_compute[187128]: 2025-12-11 06:16:03.287 187132 DEBUG nova.compute.manager [req-ad95e8cb-bbd2-430b-b5b4-f4dc9d3f371f req-1441aabb-01e5-4407-9ad4-9b9d94634a38 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:16:03 np0005554845 nova_compute[187128]: 2025-12-11 06:16:03.287 187132 DEBUG oslo_concurrency.lockutils [req-ad95e8cb-bbd2-430b-b5b4-f4dc9d3f371f req-1441aabb-01e5-4407-9ad4-9b9d94634a38 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:03 np0005554845 nova_compute[187128]: 2025-12-11 06:16:03.287 187132 DEBUG oslo_concurrency.lockutils [req-ad95e8cb-bbd2-430b-b5b4-f4dc9d3f371f req-1441aabb-01e5-4407-9ad4-9b9d94634a38 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:03 np0005554845 nova_compute[187128]: 2025-12-11 06:16:03.288 187132 DEBUG oslo_concurrency.lockutils [req-ad95e8cb-bbd2-430b-b5b4-f4dc9d3f371f req-1441aabb-01e5-4407-9ad4-9b9d94634a38 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.288 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5f7d50c2-30, col_values=(('external_ids', {'iface-id': '825c1ff3-5669-4fe9-9c6a-2a8b74e06612'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:03 np0005554845 nova_compute[187128]: 2025-12-11 06:16:03.288 187132 DEBUG nova.compute.manager [req-ad95e8cb-bbd2-430b-b5b4-f4dc9d3f371f req-1441aabb-01e5-4407-9ad4-9b9d94634a38 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Processing event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:16:03 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:03Z|00254|binding|INFO|Releasing lport 825c1ff3-5669-4fe9-9c6a-2a8b74e06612 from this chassis (sb_readonly=0)
Dec 11 01:16:03 np0005554845 nova_compute[187128]: 2025-12-11 06:16:03.289 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.290 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5f7d50c2-325f-481b-ab67-1c19b7285e1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5f7d50c2-325f-481b-ab67-1c19b7285e1a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.292 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[85389e56-4688-4670-9983-4c2c6e24f4e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.292 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-5f7d50c2-325f-481b-ab67-1c19b7285e1a
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/5f7d50c2-325f-481b-ab67-1c19b7285e1a.pid.haproxy
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 5f7d50c2-325f-481b-ab67-1c19b7285e1a
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:16:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:03.293 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'env', 'PROCESS_TAG=haproxy-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5f7d50c2-325f-481b-ab67-1c19b7285e1a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:16:03 np0005554845 nova_compute[187128]: 2025-12-11 06:16:03.300 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:03 np0005554845 podman[222228]: 2025-12-11 06:16:03.679507825 +0000 UTC m=+0.058788307 container create 5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:16:03 np0005554845 systemd[1]: Started libpod-conmon-5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba.scope.
Dec 11 01:16:03 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:16:03 np0005554845 podman[222228]: 2025-12-11 06:16:03.643852447 +0000 UTC m=+0.023132949 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:16:03 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a4e7aaccc5ce4491281bd5afe23a3f51d0acc5b5ba3691ce921717a93de32956/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:16:03 np0005554845 podman[222228]: 2025-12-11 06:16:03.754624664 +0000 UTC m=+0.133905156 container init 5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 01:16:03 np0005554845 podman[222228]: 2025-12-11 06:16:03.760583706 +0000 UTC m=+0.139864178 container start 5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:16:03 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222243]: [NOTICE]   (222247) : New worker (222249) forked
Dec 11 01:16:03 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222243]: [NOTICE]   (222247) : Loading success.
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.530 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433764.5301888, e8fcccda-d2bc-4e5a-b478-e526a1d2662c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.531 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] VM Started (Lifecycle Event)#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.534 187132 DEBUG nova.compute.manager [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.539 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.543 187132 INFO nova.virt.libvirt.driver [-] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Instance spawned successfully.#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.544 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.561 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.569 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.576 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.577 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.577 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.578 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.579 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.580 187132 DEBUG nova.virt.libvirt.driver [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.607 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.608 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433764.530316, e8fcccda-d2bc-4e5a-b478-e526a1d2662c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.608 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.635 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.638 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433764.5378823, e8fcccda-d2bc-4e5a-b478-e526a1d2662c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.638 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.667 187132 INFO nova.compute.manager [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Took 10.02 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.667 187132 DEBUG nova.compute.manager [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.669 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.677 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.702 187132 DEBUG nova.network.neutron [req-30e6dc2e-3bb5-44fe-abbe-3cfe1a588a46 req-30fb29af-2fdf-477b-872e-a5f961523f96 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Updated VIF entry in instance network info cache for port 50100b7e-dec5-41e3-a26a-a1b50420a48f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.702 187132 DEBUG nova.network.neutron [req-30e6dc2e-3bb5-44fe-abbe-3cfe1a588a46 req-30fb29af-2fdf-477b-872e-a5f961523f96 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Updating instance_info_cache with network_info: [{"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.723 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.727 187132 DEBUG oslo_concurrency.lockutils [req-30e6dc2e-3bb5-44fe-abbe-3cfe1a588a46 req-30fb29af-2fdf-477b-872e-a5f961523f96 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.762 187132 INFO nova.compute.manager [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Took 11.07 seconds to build instance.#033[00m
Dec 11 01:16:04 np0005554845 nova_compute[187128]: 2025-12-11 06:16:04.784 187132 DEBUG oslo_concurrency.lockutils [None req-f56fbef0-6bd5-402a-8c92-d6111695d7b9 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 11.174s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:05 np0005554845 nova_compute[187128]: 2025-12-11 06:16:05.311 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433750.3098176, 4d2c2d90-6514-4e53-b77f-30e376bcb3ab => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:16:05 np0005554845 nova_compute[187128]: 2025-12-11 06:16:05.311 187132 INFO nova.compute.manager [-] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:16:05 np0005554845 nova_compute[187128]: 2025-12-11 06:16:05.327 187132 DEBUG nova.compute.manager [None req-7230c98f-340a-4fdf-a9fa-29bb5a59e4bd - - - - - -] [instance: 4d2c2d90-6514-4e53-b77f-30e376bcb3ab] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:16:05 np0005554845 nova_compute[187128]: 2025-12-11 06:16:05.378 187132 DEBUG nova.compute.manager [req-c9185f5c-8f6b-46b8-a13c-b6650da64025 req-5ede0a91-39e7-4834-9b91-12173ffdc795 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:16:05 np0005554845 nova_compute[187128]: 2025-12-11 06:16:05.379 187132 DEBUG oslo_concurrency.lockutils [req-c9185f5c-8f6b-46b8-a13c-b6650da64025 req-5ede0a91-39e7-4834-9b91-12173ffdc795 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:05 np0005554845 nova_compute[187128]: 2025-12-11 06:16:05.380 187132 DEBUG oslo_concurrency.lockutils [req-c9185f5c-8f6b-46b8-a13c-b6650da64025 req-5ede0a91-39e7-4834-9b91-12173ffdc795 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:05 np0005554845 nova_compute[187128]: 2025-12-11 06:16:05.380 187132 DEBUG oslo_concurrency.lockutils [req-c9185f5c-8f6b-46b8-a13c-b6650da64025 req-5ede0a91-39e7-4834-9b91-12173ffdc795 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:05 np0005554845 nova_compute[187128]: 2025-12-11 06:16:05.381 187132 DEBUG nova.compute.manager [req-c9185f5c-8f6b-46b8-a13c-b6650da64025 req-5ede0a91-39e7-4834-9b91-12173ffdc795 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] No waiting events found dispatching network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:16:05 np0005554845 nova_compute[187128]: 2025-12-11 06:16:05.381 187132 WARNING nova.compute.manager [req-c9185f5c-8f6b-46b8-a13c-b6650da64025 req-5ede0a91-39e7-4834-9b91-12173ffdc795 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received unexpected event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f for instance with vm_state active and task_state None.#033[00m
Dec 11 01:16:05 np0005554845 nova_compute[187128]: 2025-12-11 06:16:05.614 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:06 np0005554845 nova_compute[187128]: 2025-12-11 06:16:06.742 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:07 np0005554845 podman[222265]: 2025-12-11 06:16:07.128336644 +0000 UTC m=+0.056100614 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:16:07 np0005554845 podman[222266]: 2025-12-11 06:16:07.149133489 +0000 UTC m=+0.069946010 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, config_id=edpm)
Dec 11 01:16:08 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:08Z|00255|binding|INFO|Releasing lport 825c1ff3-5669-4fe9-9c6a-2a8b74e06612 from this chassis (sb_readonly=0)
Dec 11 01:16:08 np0005554845 nova_compute[187128]: 2025-12-11 06:16:08.376 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:08 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:08Z|00256|binding|INFO|Releasing lport 825c1ff3-5669-4fe9-9c6a-2a8b74e06612 from this chassis (sb_readonly=0)
Dec 11 01:16:08 np0005554845 nova_compute[187128]: 2025-12-11 06:16:08.506 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:09 np0005554845 nova_compute[187128]: 2025-12-11 06:16:09.641 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:09 np0005554845 NetworkManager[55529]: <info>  [1765433769.6445] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/130)
Dec 11 01:16:09 np0005554845 NetworkManager[55529]: <info>  [1765433769.6458] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/131)
Dec 11 01:16:09 np0005554845 nova_compute[187128]: 2025-12-11 06:16:09.706 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:09Z|00257|binding|INFO|Releasing lport 825c1ff3-5669-4fe9-9c6a-2a8b74e06612 from this chassis (sb_readonly=0)
Dec 11 01:16:09 np0005554845 nova_compute[187128]: 2025-12-11 06:16:09.716 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:10 np0005554845 nova_compute[187128]: 2025-12-11 06:16:10.021 187132 DEBUG nova.compute.manager [req-9c5ad024-525b-43b6-a44f-d0a4acc01485 req-0578c8a6-9097-4427-9c58-71ffe38fbdcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-changed-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:16:10 np0005554845 nova_compute[187128]: 2025-12-11 06:16:10.021 187132 DEBUG nova.compute.manager [req-9c5ad024-525b-43b6-a44f-d0a4acc01485 req-0578c8a6-9097-4427-9c58-71ffe38fbdcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Refreshing instance network info cache due to event network-changed-50100b7e-dec5-41e3-a26a-a1b50420a48f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:16:10 np0005554845 nova_compute[187128]: 2025-12-11 06:16:10.022 187132 DEBUG oslo_concurrency.lockutils [req-9c5ad024-525b-43b6-a44f-d0a4acc01485 req-0578c8a6-9097-4427-9c58-71ffe38fbdcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:16:10 np0005554845 nova_compute[187128]: 2025-12-11 06:16:10.022 187132 DEBUG oslo_concurrency.lockutils [req-9c5ad024-525b-43b6-a44f-d0a4acc01485 req-0578c8a6-9097-4427-9c58-71ffe38fbdcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:16:10 np0005554845 nova_compute[187128]: 2025-12-11 06:16:10.023 187132 DEBUG nova.network.neutron [req-9c5ad024-525b-43b6-a44f-d0a4acc01485 req-0578c8a6-9097-4427-9c58-71ffe38fbdcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Refreshing network info cache for port 50100b7e-dec5-41e3-a26a-a1b50420a48f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:16:10 np0005554845 nova_compute[187128]: 2025-12-11 06:16:10.616 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:11 np0005554845 nova_compute[187128]: 2025-12-11 06:16:11.745 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:11 np0005554845 nova_compute[187128]: 2025-12-11 06:16:11.843 187132 DEBUG nova.network.neutron [req-9c5ad024-525b-43b6-a44f-d0a4acc01485 req-0578c8a6-9097-4427-9c58-71ffe38fbdcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Updated VIF entry in instance network info cache for port 50100b7e-dec5-41e3-a26a-a1b50420a48f. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:16:11 np0005554845 nova_compute[187128]: 2025-12-11 06:16:11.844 187132 DEBUG nova.network.neutron [req-9c5ad024-525b-43b6-a44f-d0a4acc01485 req-0578c8a6-9097-4427-9c58-71ffe38fbdcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Updating instance_info_cache with network_info: [{"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:16:11 np0005554845 nova_compute[187128]: 2025-12-11 06:16:11.864 187132 DEBUG oslo_concurrency.lockutils [req-9c5ad024-525b-43b6-a44f-d0a4acc01485 req-0578c8a6-9097-4427-9c58-71ffe38fbdcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:16:15 np0005554845 nova_compute[187128]: 2025-12-11 06:16:15.620 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:16 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:16Z|00034|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:c6:2b:0e 10.100.0.14
Dec 11 01:16:16 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:16Z|00035|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:c6:2b:0e 10.100.0.14
Dec 11 01:16:16 np0005554845 nova_compute[187128]: 2025-12-11 06:16:16.747 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:17 np0005554845 nova_compute[187128]: 2025-12-11 06:16:17.390 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:19Z|00258|binding|INFO|Releasing lport 825c1ff3-5669-4fe9-9c6a-2a8b74e06612 from this chassis (sb_readonly=0)
Dec 11 01:16:19 np0005554845 nova_compute[187128]: 2025-12-11 06:16:19.213 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:20 np0005554845 nova_compute[187128]: 2025-12-11 06:16:20.119 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:20 np0005554845 nova_compute[187128]: 2025-12-11 06:16:20.622 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:21 np0005554845 nova_compute[187128]: 2025-12-11 06:16:21.044 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:21 np0005554845 nova_compute[187128]: 2025-12-11 06:16:21.751 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:21 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:21.967 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:87:94 10.100.0.2 2001:db8::f816:3eff:fe3f:8794'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:8794/64', 'neutron:device_id': 'ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=078ed33d-3a39-4095-bb26-184c0c14abff, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a972d4f-67a7-4d0d-9e44-3eec77085e79) old=Port_Binding(mac=['fa:16:3e:3f:87:94 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:16:21 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:21.969 104320 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2a972d4f-67a7-4d0d-9e44-3eec77085e79 in datapath e8f46fd3-4213-49d6-9445-d5868c7b20f6 updated#033[00m
Dec 11 01:16:21 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:21.971 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e8f46fd3-4213-49d6-9445-d5868c7b20f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:16:21 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:21.972 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[120819ce-7e38-4391-91d2-ea8441df234f]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:22 np0005554845 podman[222333]: 2025-12-11 06:16:22.145660182 +0000 UTC m=+0.070608858 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:16:23 np0005554845 nova_compute[187128]: 2025-12-11 06:16:23.279 187132 INFO nova.compute.manager [None req-72e0f5c3-7802-4fb5-a2eb-60ee8b10aff2 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Get console output#033[00m
Dec 11 01:16:23 np0005554845 nova_compute[187128]: 2025-12-11 06:16:23.286 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:16:23 np0005554845 nova_compute[187128]: 2025-12-11 06:16:23.632 187132 DEBUG nova.objects.instance [None req-28932507-2f2b-4589-b7f2-3ba7f69f0ac1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'pci_devices' on Instance uuid e8fcccda-d2bc-4e5a-b478-e526a1d2662c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:16:23 np0005554845 nova_compute[187128]: 2025-12-11 06:16:23.660 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433783.6605363, e8fcccda-d2bc-4e5a-b478-e526a1d2662c => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:16:23 np0005554845 nova_compute[187128]: 2025-12-11 06:16:23.661 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:16:23 np0005554845 nova_compute[187128]: 2025-12-11 06:16:23.682 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:16:23 np0005554845 nova_compute[187128]: 2025-12-11 06:16:23.687 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: active, current task_state: suspending, current DB power_state: 1, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:16:23 np0005554845 nova_compute[187128]: 2025-12-11 06:16:23.706 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] During sync_power_state the instance has a pending task (suspending). Skip.#033[00m
Dec 11 01:16:24 np0005554845 kernel: tap50100b7e-de (unregistering): left promiscuous mode
Dec 11 01:16:24 np0005554845 NetworkManager[55529]: <info>  [1765433784.5997] device (tap50100b7e-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:16:24 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:24Z|00259|binding|INFO|Releasing lport 50100b7e-dec5-41e3-a26a-a1b50420a48f from this chassis (sb_readonly=0)
Dec 11 01:16:24 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:24Z|00260|binding|INFO|Setting lport 50100b7e-dec5-41e3-a26a-a1b50420a48f down in Southbound
Dec 11 01:16:24 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:24Z|00261|binding|INFO|Removing iface tap50100b7e-de ovn-installed in OVS
Dec 11 01:16:24 np0005554845 nova_compute[187128]: 2025-12-11 06:16:24.609 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.618 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:2b:0e 10.100.0.14'], port_security=['fa:16:3e:c6:2b:0e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e8fcccda-d2bc-4e5a-b478-e526a1d2662c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '4', 'neutron:security_group_ids': '78eb39e7-31de-4384-bd28-5cab9069ce58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfe2697e-ebea-497a-b5ca-c1d428531e27, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=50100b7e-dec5-41e3-a26a-a1b50420a48f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.620 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 50100b7e-dec5-41e3-a26a-a1b50420a48f in datapath 5f7d50c2-325f-481b-ab67-1c19b7285e1a unbound from our chassis#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.622 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5f7d50c2-325f-481b-ab67-1c19b7285e1a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.623 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[08a789c9-3334-4cd7-a945-9255e65ef3d5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.623 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a namespace which is not needed anymore#033[00m
Dec 11 01:16:24 np0005554845 nova_compute[187128]: 2025-12-11 06:16:24.632 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:24 np0005554845 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000027.scope: Deactivated successfully.
Dec 11 01:16:24 np0005554845 systemd[1]: machine-qemu\x2d18\x2dinstance\x2d00000027.scope: Consumed 14.086s CPU time.
Dec 11 01:16:24 np0005554845 systemd-machined[153381]: Machine qemu-18-instance-00000027 terminated.
Dec 11 01:16:24 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222243]: [NOTICE]   (222247) : haproxy version is 2.8.14-c23fe91
Dec 11 01:16:24 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222243]: [NOTICE]   (222247) : path to executable is /usr/sbin/haproxy
Dec 11 01:16:24 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222243]: [WARNING]  (222247) : Exiting Master process...
Dec 11 01:16:24 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222243]: [ALERT]    (222247) : Current worker (222249) exited with code 143 (Terminated)
Dec 11 01:16:24 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222243]: [WARNING]  (222247) : All workers exited. Exiting... (0)
Dec 11 01:16:24 np0005554845 systemd[1]: libpod-5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba.scope: Deactivated successfully.
Dec 11 01:16:24 np0005554845 podman[222387]: 2025-12-11 06:16:24.748845369 +0000 UTC m=+0.041297673 container died 5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 11 01:16:24 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba-userdata-shm.mount: Deactivated successfully.
Dec 11 01:16:24 np0005554845 systemd[1]: var-lib-containers-storage-overlay-a4e7aaccc5ce4491281bd5afe23a3f51d0acc5b5ba3691ce921717a93de32956-merged.mount: Deactivated successfully.
Dec 11 01:16:24 np0005554845 podman[222387]: 2025-12-11 06:16:24.783038607 +0000 UTC m=+0.075490901 container cleanup 5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Dec 11 01:16:24 np0005554845 systemd[1]: libpod-conmon-5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba.scope: Deactivated successfully.
Dec 11 01:16:24 np0005554845 nova_compute[187128]: 2025-12-11 06:16:24.794 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:24 np0005554845 nova_compute[187128]: 2025-12-11 06:16:24.799 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:24 np0005554845 nova_compute[187128]: 2025-12-11 06:16:24.827 187132 DEBUG nova.compute.manager [None req-28932507-2f2b-4589-b7f2-3ba7f69f0ac1 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:16:24 np0005554845 podman[222418]: 2025-12-11 06:16:24.853435078 +0000 UTC m=+0.047415928 container remove 5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.859 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[93df20f5-608c-44e1-8a55-9f3f27384c84]: (4, ('Thu Dec 11 06:16:24 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a (5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba)\n5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba\nThu Dec 11 06:16:24 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a (5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba)\n5275f65b617aa641105ebabac90e13420263f0c0fbbdba7a0fcb8a47c93485ba\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.861 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[59810a68-457d-4dd0-aeee-d52aabfa0642]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.862 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f7d50c2-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:24 np0005554845 nova_compute[187128]: 2025-12-11 06:16:24.863 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:24 np0005554845 kernel: tap5f7d50c2-30: left promiscuous mode
Dec 11 01:16:24 np0005554845 nova_compute[187128]: 2025-12-11 06:16:24.878 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.880 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[df405c6c-be08-40bb-b48e-6b2673be19ad]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.900 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cb59c0b5-46d5-4482-b57f-9add30b8a01d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.901 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[9cea7c9b-ebc2-4d4d-8935-1eeec8d7f2e3]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.918 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d746f928-ab56-4a10-b24e-07fa4d2c45b0]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 403770, 'reachable_time': 38703, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222448, 'error': None, 'target': 'ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.920 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:16:24 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:24.920 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[60f46af0-0ebe-4c1b-8629-dfe175b4737b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:24 np0005554845 systemd[1]: run-netns-ovnmeta\x2d5f7d50c2\x2d325f\x2d481b\x2dab67\x2d1c19b7285e1a.mount: Deactivated successfully.
Dec 11 01:16:25 np0005554845 nova_compute[187128]: 2025-12-11 06:16:25.625 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:26.229 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:26.229 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:26.230 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.699 187132 DEBUG nova.compute.manager [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-vif-unplugged-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.700 187132 DEBUG oslo_concurrency.lockutils [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.701 187132 DEBUG oslo_concurrency.lockutils [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.702 187132 DEBUG oslo_concurrency.lockutils [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.702 187132 DEBUG nova.compute.manager [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] No waiting events found dispatching network-vif-unplugged-50100b7e-dec5-41e3-a26a-a1b50420a48f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.703 187132 WARNING nova.compute.manager [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received unexpected event network-vif-unplugged-50100b7e-dec5-41e3-a26a-a1b50420a48f for instance with vm_state suspended and task_state None.#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.703 187132 DEBUG nova.compute.manager [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.704 187132 DEBUG oslo_concurrency.lockutils [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.704 187132 DEBUG oslo_concurrency.lockutils [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.704 187132 DEBUG oslo_concurrency.lockutils [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.705 187132 DEBUG nova.compute.manager [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] No waiting events found dispatching network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.705 187132 WARNING nova.compute.manager [req-e3445cde-09e0-4999-8256-6c22788ea4f7 req-2fa06a07-ac7d-4bce-85e4-14c5150fba85 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received unexpected event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f for instance with vm_state suspended and task_state None.#033[00m
Dec 11 01:16:26 np0005554845 nova_compute[187128]: 2025-12-11 06:16:26.754 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:27 np0005554845 nova_compute[187128]: 2025-12-11 06:16:27.786 187132 INFO nova.compute.manager [None req-60e79986-f0b8-40ac-8cb0-4cac78b4be8e 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Get console output#033[00m
Dec 11 01:16:27 np0005554845 nova_compute[187128]: 2025-12-11 06:16:27.952 187132 INFO nova.compute.manager [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Resuming#033[00m
Dec 11 01:16:27 np0005554845 nova_compute[187128]: 2025-12-11 06:16:27.953 187132 DEBUG nova.objects.instance [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'flavor' on Instance uuid e8fcccda-d2bc-4e5a-b478-e526a1d2662c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:16:27 np0005554845 nova_compute[187128]: 2025-12-11 06:16:27.987 187132 DEBUG oslo_concurrency.lockutils [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:16:27 np0005554845 nova_compute[187128]: 2025-12-11 06:16:27.988 187132 DEBUG oslo_concurrency.lockutils [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquired lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:16:27 np0005554845 nova_compute[187128]: 2025-12-11 06:16:27.988 187132 DEBUG nova.network.neutron [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:16:28 np0005554845 podman[222449]: 2025-12-11 06:16:28.117279564 +0000 UTC m=+0.051110539 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 01:16:30 np0005554845 nova_compute[187128]: 2025-12-11 06:16:30.626 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:30 np0005554845 nova_compute[187128]: 2025-12-11 06:16:30.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:16:30 np0005554845 nova_compute[187128]: 2025-12-11 06:16:30.713 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:30 np0005554845 nova_compute[187128]: 2025-12-11 06:16:30.714 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:30 np0005554845 nova_compute[187128]: 2025-12-11 06:16:30.714 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:30 np0005554845 nova_compute[187128]: 2025-12-11 06:16:30.714 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:16:30 np0005554845 nova_compute[187128]: 2025-12-11 06:16:30.782 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:16:30 np0005554845 podman[222473]: 2025-12-11 06:16:30.79999813 +0000 UTC m=+0.043138002 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, 
config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:16:30 np0005554845 nova_compute[187128]: 2025-12-11 06:16:30.838 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk --force-share --output=json" returned: 0 in 0.056s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:16:30 np0005554845 nova_compute[187128]: 2025-12-11 06:16:30.839 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:16:30 np0005554845 podman[222475]: 2025-12-11 06:16:30.870266638 +0000 UTC m=+0.110766009 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, container_name=ovn_controller, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0)
Dec 11 01:16:30 np0005554845 nova_compute[187128]: 2025-12-11 06:16:30.903 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c/disk --force-share --output=json" returned: 0 in 0.064s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.037 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.038 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5601MB free_disk=73.15740966796875GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.039 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.039 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.100 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance e8fcccda-d2bc-4e5a-b478-e526a1d2662c actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.101 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.101 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.139 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.157 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.177 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.178 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.139s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:31 np0005554845 nova_compute[187128]: 2025-12-11 06:16:31.757 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:32 np0005554845 nova_compute[187128]: 2025-12-11 06:16:32.178 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:16:32 np0005554845 nova_compute[187128]: 2025-12-11 06:16:32.686 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:16:32 np0005554845 nova_compute[187128]: 2025-12-11 06:16:32.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:16:32 np0005554845 nova_compute[187128]: 2025-12-11 06:16:32.690 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:16:32 np0005554845 nova_compute[187128]: 2025-12-11 06:16:32.690 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:16:32 np0005554845 nova_compute[187128]: 2025-12-11 06:16:32.717 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.424 187132 DEBUG nova.network.neutron [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Updating instance_info_cache with network_info: [{"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.453 187132 DEBUG oslo_concurrency.lockutils [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Releasing lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.454 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.454 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.455 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid e8fcccda-d2bc-4e5a-b478-e526a1d2662c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.459 187132 DEBUG nova.virt.libvirt.vif [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-867218591',display_name='tempest-TestNetworkAdvancedServerOps-server-867218591',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-867218591',id=39,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL0AMCjhpLgAYbyW4RbNIsnfsfUIy5k9j3OYnQ76lUxwzuqHt5hm5bLSBBEruJUcifDTHX+MUrSG1utS0WCewPMauEm/KwEP/RhnwGQJFj9Xrh5egfQsGN9ZRRoEZNVLpQ==',key_name='tempest-TestNetworkAdvancedServerOps-1189574564',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:16:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=<?>,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=4,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-m03msnke',resources=<?>,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',old_vm_state='active',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='resuming',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:16:24Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=e8fcccda-d2bc-4e5a-b478-e526a1d2662c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='suspended') vif={"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": 
[{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.460 187132 DEBUG nova.network.os_vif_util [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.461 187132 DEBUG nova.network.os_vif_util [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:2b:0e,bridge_name='br-int',has_traffic_filtering=True,id=50100b7e-dec5-41e3-a26a-a1b50420a48f,network=Network(5f7d50c2-325f-481b-ab67-1c19b7285e1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50100b7e-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.461 187132 DEBUG os_vif [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:2b:0e,bridge_name='br-int',has_traffic_filtering=True,id=50100b7e-dec5-41e3-a26a-a1b50420a48f,network=Network(5f7d50c2-325f-481b-ab67-1c19b7285e1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50100b7e-de') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.462 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.462 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.462 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.466 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.466 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap50100b7e-de, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.466 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap50100b7e-de, col_values=(('external_ids', {'iface-id': '50100b7e-dec5-41e3-a26a-a1b50420a48f', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:c6:2b:0e', 'vm-uuid': 'e8fcccda-d2bc-4e5a-b478-e526a1d2662c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.467 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.467 187132 INFO os_vif [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:2b:0e,bridge_name='br-int',has_traffic_filtering=True,id=50100b7e-dec5-41e3-a26a-a1b50420a48f,network=Network(5f7d50c2-325f-481b-ab67-1c19b7285e1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50100b7e-de')#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.484 187132 DEBUG nova.objects.instance [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'numa_topology' on Instance uuid e8fcccda-d2bc-4e5a-b478-e526a1d2662c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:16:33 np0005554845 kernel: tap50100b7e-de: entered promiscuous mode
Dec 11 01:16:33 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:33Z|00262|binding|INFO|Claiming lport 50100b7e-dec5-41e3-a26a-a1b50420a48f for this chassis.
Dec 11 01:16:33 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:33Z|00263|binding|INFO|50100b7e-dec5-41e3-a26a-a1b50420a48f: Claiming fa:16:3e:c6:2b:0e 10.100.0.14
Dec 11 01:16:33 np0005554845 NetworkManager[55529]: <info>  [1765433793.5837] manager: (tap50100b7e-de): new Tun device (/org/freedesktop/NetworkManager/Devices/132)
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.582 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.587 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:2b:0e 10.100.0.14'], port_security=['fa:16:3e:c6:2b:0e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e8fcccda-d2bc-4e5a-b478-e526a1d2662c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '5', 'neutron:security_group_ids': '78eb39e7-31de-4384-bd28-5cab9069ce58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:port_fip': '192.168.122.176'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfe2697e-ebea-497a-b5ca-c1d428531e27, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=50100b7e-dec5-41e3-a26a-a1b50420a48f) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.588 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 50100b7e-dec5-41e3-a26a-a1b50420a48f in datapath 5f7d50c2-325f-481b-ab67-1c19b7285e1a bound to our chassis#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.589 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 5f7d50c2-325f-481b-ab67-1c19b7285e1a#033[00m
Dec 11 01:16:33 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:33Z|00264|binding|INFO|Setting lport 50100b7e-dec5-41e3-a26a-a1b50420a48f ovn-installed in OVS
Dec 11 01:16:33 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:33Z|00265|binding|INFO|Setting lport 50100b7e-dec5-41e3-a26a-a1b50420a48f up in Southbound
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.595 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.598 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.602 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1a042b80-7ce5-4b20-b58c-0c133d5a8d79]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.603 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap5f7d50c2-31 in ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.605 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap5f7d50c2-30 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.605 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[9e863914-4f23-45c0-87b6-44a498afe7c6]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.606 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a2de43f9-ba9e-4272-9c5a-b47912e50b4c]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.618 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[2cb63cce-e08a-4a81-a022-ad1d41188f96]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 systemd-udevd[222552]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:16:33 np0005554845 systemd-machined[153381]: New machine qemu-19-instance-00000027.
Dec 11 01:16:33 np0005554845 NetworkManager[55529]: <info>  [1765433793.6368] device (tap50100b7e-de): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:16:33 np0005554845 NetworkManager[55529]: <info>  [1765433793.6377] device (tap50100b7e-de): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:16:33 np0005554845 systemd[1]: Started Virtual Machine qemu-19-instance-00000027.
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.644 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b52ed470-4ba7-46e7-8643-2fec22d8c47d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 podman[222530]: 2025-12-11 06:16:33.64894728 +0000 UTC m=+0.073969479 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd)
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.672 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[6cf1f536-87c9-4ba6-98de-29401d9b32e6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.677 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[df171644-88f9-4a4d-9ba6-c817329c16f0]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 systemd-udevd[222561]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:16:33 np0005554845 NetworkManager[55529]: <info>  [1765433793.6780] manager: (tap5f7d50c2-30): new Veth device (/org/freedesktop/NetworkManager/Devices/133)
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.715 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[08341299-bedb-430e-b610-e67960f5789f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.718 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[06318172-9ed9-48da-9e21-1b5bea51a20d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 NetworkManager[55529]: <info>  [1765433793.7422] device (tap5f7d50c2-30): carrier: link connected
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.748 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[fdb79287-e3ba-48a7-9efb-38955ea92941]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.770 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[328df79f-e806-4f74-8a2a-3ec2a3334ec0]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5f7d50c2-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:71:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406838, 'reachable_time': 31303, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222590, 'error': None, 'target': 'ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.784 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6df77148-2936-4c37-8462-4c1cb60a0b6f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe7f:71ab'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 406838, 'tstamp': 406838}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 222591, 'error': None, 'target': 'ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.803 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[12d194ba-e3a5-4bc5-be38-a20ff61d14b4]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap5f7d50c2-31'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:7f:71:ab'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 79], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406838, 'reachable_time': 31303, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 222592, 'error': None, 'target': 'ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.807 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:3f:87:94 10.100.0.2 2001:db8:0:1:f816:3eff:fe3f:8794 2001:db8::f816:3eff:fe3f:8794'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:fe3f:8794/64 2001:db8::f816:3eff:fe3f:8794/64', 'neutron:device_id': 'ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=078ed33d-3a39-4095-bb26-184c0c14abff, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2a972d4f-67a7-4d0d-9e44-3eec77085e79) old=Port_Binding(mac=['fa:16:3e:3f:87:94 10.100.0.2 2001:db8::f816:3eff:fe3f:8794'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:fe3f:8794/64', 'neutron:device_id': 'ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 
'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.834 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[158897c4-1fdb-4517-923a-42b4407edda3]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.896 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[25eedd0e-0c88-4110-a06d-f525baf6d34d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.897 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f7d50c2-30, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.898 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.898 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap5f7d50c2-30, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:33 np0005554845 NetworkManager[55529]: <info>  [1765433793.9006] manager: (tap5f7d50c2-30): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/134)
Dec 11 01:16:33 np0005554845 kernel: tap5f7d50c2-30: entered promiscuous mode
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.902 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.902 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap5f7d50c2-30, col_values=(('external_ids', {'iface-id': '825c1ff3-5669-4fe9-9c6a-2a8b74e06612'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:33 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:33Z|00266|binding|INFO|Releasing lport 825c1ff3-5669-4fe9-9c6a-2a8b74e06612 from this chassis (sb_readonly=0)
Dec 11 01:16:33 np0005554845 nova_compute[187128]: 2025-12-11 06:16:33.916 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.917 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/5f7d50c2-325f-481b-ab67-1c19b7285e1a.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/5f7d50c2-325f-481b-ab67-1c19b7285e1a.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.917 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[97a6a80c-87d9-4925-b663-93703bec1734]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.918 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-5f7d50c2-325f-481b-ab67-1c19b7285e1a
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/5f7d50c2-325f-481b-ab67-1c19b7285e1a.pid.haproxy
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 5f7d50c2-325f-481b-ab67-1c19b7285e1a
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:16:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:33.920 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'env', 'PROCESS_TAG=haproxy-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/5f7d50c2-325f-481b-ab67-1c19b7285e1a.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.047 187132 DEBUG nova.compute.manager [req-e6ee5b18-c01a-4b6d-8645-31da461d67b9 req-769ea94a-6144-4e1a-94a3-e24bc44964e9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.048 187132 DEBUG oslo_concurrency.lockutils [req-e6ee5b18-c01a-4b6d-8645-31da461d67b9 req-769ea94a-6144-4e1a-94a3-e24bc44964e9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.048 187132 DEBUG oslo_concurrency.lockutils [req-e6ee5b18-c01a-4b6d-8645-31da461d67b9 req-769ea94a-6144-4e1a-94a3-e24bc44964e9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.048 187132 DEBUG oslo_concurrency.lockutils [req-e6ee5b18-c01a-4b6d-8645-31da461d67b9 req-769ea94a-6144-4e1a-94a3-e24bc44964e9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.048 187132 DEBUG nova.compute.manager [req-e6ee5b18-c01a-4b6d-8645-31da461d67b9 req-769ea94a-6144-4e1a-94a3-e24bc44964e9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] No waiting events found dispatching network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.049 187132 WARNING nova.compute.manager [req-e6ee5b18-c01a-4b6d-8645-31da461d67b9 req-769ea94a-6144-4e1a-94a3-e24bc44964e9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received unexpected event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f for instance with vm_state suspended and task_state resuming.#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.267 187132 DEBUG nova.virt.libvirt.host [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Removed pending event for e8fcccda-d2bc-4e5a-b478-e526a1d2662c due to event _event_emit_delayed /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:438#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.268 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433794.2670605, e8fcccda-d2bc-4e5a-b478-e526a1d2662c => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.269 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] VM Started (Lifecycle Event)#033[00m
Dec 11 01:16:34 np0005554845 podman[222631]: 2025-12-11 06:16:34.27942736 +0000 UTC m=+0.051608192 container create 855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.296 187132 DEBUG nova.compute.manager [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Instance event wait completed in 0 seconds for  wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.296 187132 DEBUG nova.objects.instance [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'pci_devices' on Instance uuid e8fcccda-d2bc-4e5a-b478-e526a1d2662c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.303 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.306 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Synchronizing instance power state after lifecycle event "Started"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:16:34 np0005554845 systemd[1]: Started libpod-conmon-855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490.scope.
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.336 187132 INFO nova.virt.libvirt.driver [-] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Instance running successfully.#033[00m
Dec 11 01:16:34 np0005554845 virtqemud[186638]: argument unsupported: QEMU guest agent is not configured
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.339 187132 DEBUG nova.virt.libvirt.guest [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Failed to set time: agent not configured sync_guest_time /usr/lib/python3.9/site-packages/nova/virt/libvirt/guest.py:200#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.339 187132 DEBUG nova.compute.manager [None req-a4ca8834-115d-49a2-a091-5c25d100cb1a 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:16:34 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:16:34 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/ebf4afaaf6aef514a5e9e0b3bf622d5dac6e6d75053c27cd3a9e675923d6a862/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:16:34 np0005554845 podman[222631]: 2025-12-11 06:16:34.25324616 +0000 UTC m=+0.025426992 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:16:34 np0005554845 podman[222631]: 2025-12-11 06:16:34.358713863 +0000 UTC m=+0.130894695 container init 855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:16:34 np0005554845 podman[222631]: 2025-12-11 06:16:34.363930975 +0000 UTC m=+0.136111807 container start 855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.383 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.384 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433794.2768362, e8fcccda-d2bc-4e5a-b478-e526a1d2662c => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.384 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:16:34 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222646]: [NOTICE]   (222650) : New worker (222652) forked
Dec 11 01:16:34 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222646]: [NOTICE]   (222650) : Loading success.
Dec 11 01:16:34 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:34.421 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 2a972d4f-67a7-4d0d-9e44-3eec77085e79 in datapath e8f46fd3-4213-49d6-9445-d5868c7b20f6 unbound from our chassis#033[00m
Dec 11 01:16:34 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:34.423 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e8f46fd3-4213-49d6-9445-d5868c7b20f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:16:34 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:34.424 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[acd166b7-4f60-44b2-a32f-52ebe7c3d016]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.430 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.434 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: suspended, current task_state: resuming, current DB power_state: 4, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.461 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] During sync_power_state the instance has a pending task (resuming). Skip.#033[00m
Dec 11 01:16:34 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:34Z|00267|binding|INFO|Releasing lport 825c1ff3-5669-4fe9-9c6a-2a8b74e06612 from this chassis (sb_readonly=0)
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.675 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.764 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Updating instance_info_cache with network_info: [{"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.783 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.783 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.784 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.785 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.785 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:16:34 np0005554845 nova_compute[187128]: 2025-12-11 06:16:34.785 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:16:35 np0005554845 nova_compute[187128]: 2025-12-11 06:16:35.628 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:35 np0005554845 nova_compute[187128]: 2025-12-11 06:16:35.658 187132 INFO nova.compute.manager [None req-9e3f4484-f7bb-448a-867b-a3c1ae93dc80 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Get console output#033[00m
Dec 11 01:16:35 np0005554845 nova_compute[187128]: 2025-12-11 06:16:35.664 213770 INFO nova.privsep.libvirt [-] Ignored error while reading from instance console pty: can't concat NoneType to bytes#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.263 187132 DEBUG nova.compute.manager [req-26d52717-c4c4-4d04-9f29-57f8995da461 req-67720aea-2f6f-418c-abaa-f1af3b5f244a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.264 187132 DEBUG oslo_concurrency.lockutils [req-26d52717-c4c4-4d04-9f29-57f8995da461 req-67720aea-2f6f-418c-abaa-f1af3b5f244a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.265 187132 DEBUG oslo_concurrency.lockutils [req-26d52717-c4c4-4d04-9f29-57f8995da461 req-67720aea-2f6f-418c-abaa-f1af3b5f244a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.265 187132 DEBUG oslo_concurrency.lockutils [req-26d52717-c4c4-4d04-9f29-57f8995da461 req-67720aea-2f6f-418c-abaa-f1af3b5f244a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.265 187132 DEBUG nova.compute.manager [req-26d52717-c4c4-4d04-9f29-57f8995da461 req-67720aea-2f6f-418c-abaa-f1af3b5f244a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] No waiting events found dispatching network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.265 187132 WARNING nova.compute.manager [req-26d52717-c4c4-4d04-9f29-57f8995da461 req-67720aea-2f6f-418c-abaa-f1af3b5f244a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received unexpected event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f for instance with vm_state active and task_state None.#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.599 187132 DEBUG oslo_concurrency.lockutils [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.599 187132 DEBUG oslo_concurrency.lockutils [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.600 187132 DEBUG oslo_concurrency.lockutils [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.600 187132 DEBUG oslo_concurrency.lockutils [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.600 187132 DEBUG oslo_concurrency.lockutils [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.601 187132 INFO nova.compute.manager [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Terminating instance#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.602 187132 DEBUG nova.compute.manager [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:16:36 np0005554845 kernel: tap50100b7e-de (unregistering): left promiscuous mode
Dec 11 01:16:36 np0005554845 NetworkManager[55529]: <info>  [1765433796.6191] device (tap50100b7e-de): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:16:36 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:36Z|00268|binding|INFO|Releasing lport 50100b7e-dec5-41e3-a26a-a1b50420a48f from this chassis (sb_readonly=0)
Dec 11 01:16:36 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:36Z|00269|binding|INFO|Setting lport 50100b7e-dec5-41e3-a26a-a1b50420a48f down in Southbound
Dec 11 01:16:36 np0005554845 ovn_controller[95428]: 2025-12-11T06:16:36Z|00270|binding|INFO|Removing iface tap50100b7e-de ovn-installed in OVS
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.627 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.630 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.647 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:c6:2b:0e 10.100.0.14'], port_security=['fa:16:3e:c6:2b:0e 10.100.0.14'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.14/28', 'neutron:device_id': 'e8fcccda-d2bc-4e5a-b478-e526a1d2662c', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '3ec4c03cd7274517b88d9087ad4cbd83', 'neutron:revision_number': '6', 'neutron:security_group_ids': '78eb39e7-31de-4384-bd28-5cab9069ce58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=dfe2697e-ebea-497a-b5ca-c1d428531e27, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=50100b7e-dec5-41e3-a26a-a1b50420a48f) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.649 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 50100b7e-dec5-41e3-a26a-a1b50420a48f in datapath 5f7d50c2-325f-481b-ab67-1c19b7285e1a unbound from our chassis#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.651 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.652 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 5f7d50c2-325f-481b-ab67-1c19b7285e1a, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.653 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6c55705a-803e-470a-a0a4-9a97154984f5]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.653 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a namespace which is not needed anymore#033[00m
Dec 11 01:16:36 np0005554845 systemd[1]: machine-qemu\x2d19\x2dinstance\x2d00000027.scope: Deactivated successfully.
Dec 11 01:16:36 np0005554845 systemd-machined[153381]: Machine qemu-19-instance-00000027 terminated.
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.759 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:36 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222646]: [NOTICE]   (222650) : haproxy version is 2.8.14-c23fe91
Dec 11 01:16:36 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222646]: [NOTICE]   (222650) : path to executable is /usr/sbin/haproxy
Dec 11 01:16:36 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222646]: [WARNING]  (222650) : Exiting Master process...
Dec 11 01:16:36 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222646]: [WARNING]  (222650) : Exiting Master process...
Dec 11 01:16:36 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222646]: [ALERT]    (222650) : Current worker (222652) exited with code 143 (Terminated)
Dec 11 01:16:36 np0005554845 neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a[222646]: [WARNING]  (222650) : All workers exited. Exiting... (0)
Dec 11 01:16:36 np0005554845 systemd[1]: libpod-855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490.scope: Deactivated successfully.
Dec 11 01:16:36 np0005554845 podman[222685]: 2025-12-11 06:16:36.787933235 +0000 UTC m=+0.042674039 container died 855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202)
Dec 11 01:16:36 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490-userdata-shm.mount: Deactivated successfully.
Dec 11 01:16:36 np0005554845 systemd[1]: var-lib-containers-storage-overlay-ebf4afaaf6aef514a5e9e0b3bf622d5dac6e6d75053c27cd3a9e675923d6a862-merged.mount: Deactivated successfully.
Dec 11 01:16:36 np0005554845 NetworkManager[55529]: <info>  [1765433796.8182] manager: (tap50100b7e-de): new Tun device (/org/freedesktop/NetworkManager/Devices/135)
Dec 11 01:16:36 np0005554845 kernel: tap50100b7e-de: entered promiscuous mode
Dec 11 01:16:36 np0005554845 kernel: tap50100b7e-de (unregistering): left promiscuous mode
Dec 11 01:16:36 np0005554845 podman[222685]: 2025-12-11 06:16:36.827082589 +0000 UTC m=+0.081823413 container cleanup 855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.826 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:36 np0005554845 systemd[1]: libpod-conmon-855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490.scope: Deactivated successfully.
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.853 187132 INFO nova.virt.libvirt.driver [-] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Instance destroyed successfully.#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.854 187132 DEBUG nova.objects.instance [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lazy-loading 'resources' on Instance uuid e8fcccda-d2bc-4e5a-b478-e526a1d2662c obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.867 187132 DEBUG nova.virt.libvirt.vif [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:15:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestNetworkAdvancedServerOps-server-867218591',display_name='tempest-TestNetworkAdvancedServerOps-server-867218591',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testnetworkadvancedserverops-server-867218591',id=39,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBL0AMCjhpLgAYbyW4RbNIsnfsfUIy5k9j3OYnQ76lUxwzuqHt5hm5bLSBBEruJUcifDTHX+MUrSG1utS0WCewPMauEm/KwEP/RhnwGQJFj9Xrh5egfQsGN9ZRRoEZNVLpQ==',key_name='tempest-TestNetworkAdvancedServerOps-1189574564',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:16:04Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='3ec4c03cd7274517b88d9087ad4cbd83',ramdisk_id='',reservation_id='r-m03msnke',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='member,reader',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestNetworkAdvancedServerOps-369129245',owner_user_name='tempest-TestNetworkAdvancedServerOps-369129245-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:16:34Z,user_data=None,user_id='40cb523bfe1e4484bb2e91c903500c97',uuid=e8fcccda-d2bc-4e5a-b478-e526a1d2662c,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": 
"10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.867 187132 DEBUG nova.network.os_vif_util [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converting VIF {"id": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "address": "fa:16:3e:c6:2b:0e", "network": {"id": "5f7d50c2-325f-481b-ab67-1c19b7285e1a", "bridge": "br-int", "label": "tempest-network-smoke--228209366", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.14", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.176", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "3ec4c03cd7274517b88d9087ad4cbd83", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap50100b7e-de", "ovs_interfaceid": "50100b7e-dec5-41e3-a26a-a1b50420a48f", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.868 187132 DEBUG nova.network.os_vif_util [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:c6:2b:0e,bridge_name='br-int',has_traffic_filtering=True,id=50100b7e-dec5-41e3-a26a-a1b50420a48f,network=Network(5f7d50c2-325f-481b-ab67-1c19b7285e1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50100b7e-de') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.868 187132 DEBUG os_vif [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Unplugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:2b:0e,bridge_name='br-int',has_traffic_filtering=True,id=50100b7e-dec5-41e3-a26a-a1b50420a48f,network=Network(5f7d50c2-325f-481b-ab67-1c19b7285e1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50100b7e-de') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.871 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.871 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap50100b7e-de, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.872 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.874 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.876 187132 INFO os_vif [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:c6:2b:0e,bridge_name='br-int',has_traffic_filtering=True,id=50100b7e-dec5-41e3-a26a-a1b50420a48f,network=Network(5f7d50c2-325f-481b-ab67-1c19b7285e1a),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap50100b7e-de')#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.877 187132 INFO nova.virt.libvirt.driver [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Deleting instance files /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c_del#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.877 187132 INFO nova.virt.libvirt.driver [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Deletion of /var/lib/nova/instances/e8fcccda-d2bc-4e5a-b478-e526a1d2662c_del complete#033[00m
Dec 11 01:16:36 np0005554845 podman[222724]: 2025-12-11 06:16:36.894421347 +0000 UTC m=+0.040801199 container remove 855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.899 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8900c4c2-e198-4b85-b329-2adccea2c109]: (4, ('Thu Dec 11 06:16:36 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a (855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490)\n855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490\nThu Dec 11 06:16:36 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a (855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490)\n855688f762ec1a35ac802521eaa537c58d62696a2f08748355e8a5c8cd444490\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.901 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0087ccf9-1618-4cb8-8140-23906ef50783]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.902 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap5f7d50c2-30, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.903 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:36 np0005554845 kernel: tap5f7d50c2-30: left promiscuous mode
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.905 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.907 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fe368281-7919-4976-a5b0-d96c4fafa847]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.918 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.922 187132 INFO nova.compute.manager [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Took 0.32 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.923 187132 DEBUG oslo.service.loopingcall [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.923 187132 DEBUG nova.compute.manager [-] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:16:36 np0005554845 nova_compute[187128]: 2025-12-11 06:16:36.923 187132 DEBUG nova.network.neutron [-] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.932 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[30056fd8-015c-49fc-bc7f-a3dea31401f6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.933 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6012b3e0-1fc6-4319-a82c-eb216d415a79]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.949 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[26f3182d-4927-45f0-969b-6530567498e9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 406831, 'reachable_time': 41186, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 222743, 'error': None, 'target': 'ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.951 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-5f7d50c2-325f-481b-ab67-1c19b7285e1a deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:16:36 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:36.951 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[f738de02-e6b5-43fa-84ee-cfcfe3d20150]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:16:36 np0005554845 systemd[1]: run-netns-ovnmeta\x2d5f7d50c2\x2d325f\x2d481b\x2dab67\x2d1c19b7285e1a.mount: Deactivated successfully.
Dec 11 01:16:38 np0005554845 podman[222745]: 2025-12-11 06:16:38.121803026 +0000 UTC m=+0.054797049 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, distribution-scope=public, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, vendor=Red Hat, Inc., architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, io.openshift.expose-services=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6)
Dec 11 01:16:38 np0005554845 podman[222744]: 2025-12-11 06:16:38.130896872 +0000 UTC m=+0.063414183 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:16:38 np0005554845 nova_compute[187128]: 2025-12-11 06:16:38.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:16:38 np0005554845 nova_compute[187128]: 2025-12-11 06:16:38.721 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:16:38 np0005554845 nova_compute[187128]: 2025-12-11 06:16:38.722 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:16:38 np0005554845 nova_compute[187128]: 2025-12-11 06:16:38.775 187132 DEBUG nova.compute.manager [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-changed-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:16:38 np0005554845 nova_compute[187128]: 2025-12-11 06:16:38.775 187132 DEBUG nova.compute.manager [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Refreshing instance network info cache due to event network-changed-50100b7e-dec5-41e3-a26a-a1b50420a48f. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:16:38 np0005554845 nova_compute[187128]: 2025-12-11 06:16:38.775 187132 DEBUG oslo_concurrency.lockutils [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:16:38 np0005554845 nova_compute[187128]: 2025-12-11 06:16:38.776 187132 DEBUG oslo_concurrency.lockutils [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:16:38 np0005554845 nova_compute[187128]: 2025-12-11 06:16:38.776 187132 DEBUG nova.network.neutron [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Refreshing network info cache for port 50100b7e-dec5-41e3-a26a-a1b50420a48f _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.184 187132 INFO nova.network.neutron [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Port 50100b7e-dec5-41e3-a26a-a1b50420a48f from network info_cache is no longer associated with instance in Neutron. Removing from network info_cache.#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.185 187132 DEBUG nova.network.neutron [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:16:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:39.207 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=15, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=14) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.208 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:39 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:39.208 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 8 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.224 187132 DEBUG oslo_concurrency.lockutils [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-e8fcccda-d2bc-4e5a-b478-e526a1d2662c" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.225 187132 DEBUG nova.compute.manager [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-vif-unplugged-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.226 187132 DEBUG oslo_concurrency.lockutils [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.226 187132 DEBUG oslo_concurrency.lockutils [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.227 187132 DEBUG oslo_concurrency.lockutils [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.227 187132 DEBUG nova.compute.manager [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] No waiting events found dispatching network-vif-unplugged-50100b7e-dec5-41e3-a26a-a1b50420a48f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.227 187132 DEBUG nova.compute.manager [req-dc236b1b-8c49-48eb-93bd-6594df2247ef req-e4cf7275-683e-4776-a64d-e65488b0d618 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-vif-unplugged-50100b7e-dec5-41e3-a26a-a1b50420a48f for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.241 187132 DEBUG nova.network.neutron [-] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.265 187132 INFO nova.compute.manager [-] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Took 2.34 seconds to deallocate network for instance.#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.331 187132 DEBUG oslo_concurrency.lockutils [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.332 187132 DEBUG oslo_concurrency.lockutils [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.398 187132 DEBUG nova.compute.manager [req-95f4266d-8087-4e64-aacc-43a95a3f1989 req-bf05acc1-b283-45fa-bc01-883c8aeac79f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-vif-deleted-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.403 187132 DEBUG nova.compute.provider_tree [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.420 187132 DEBUG nova.scheduler.client.report [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.439 187132 DEBUG oslo_concurrency.lockutils [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.108s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.466 187132 INFO nova.scheduler.client.report [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Deleted allocations for instance e8fcccda-d2bc-4e5a-b478-e526a1d2662c#033[00m
Dec 11 01:16:39 np0005554845 nova_compute[187128]: 2025-12-11 06:16:39.542 187132 DEBUG oslo_concurrency.lockutils [None req-d888a4a5-7b03-4f98-a7d9-2500a0d4f406 40cb523bfe1e4484bb2e91c903500c97 3ec4c03cd7274517b88d9087ad4cbd83 - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.943s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:40 np0005554845 nova_compute[187128]: 2025-12-11 06:16:40.631 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:40 np0005554845 nova_compute[187128]: 2025-12-11 06:16:40.951 187132 DEBUG nova.compute.manager [req-fc061a5e-c34b-4a49-9248-e31d6a5d728c req-424ddac9-c8cc-4ae0-b6ed-58a51ff8f0bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:16:40 np0005554845 nova_compute[187128]: 2025-12-11 06:16:40.952 187132 DEBUG oslo_concurrency.lockutils [req-fc061a5e-c34b-4a49-9248-e31d6a5d728c req-424ddac9-c8cc-4ae0-b6ed-58a51ff8f0bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:16:40 np0005554845 nova_compute[187128]: 2025-12-11 06:16:40.952 187132 DEBUG oslo_concurrency.lockutils [req-fc061a5e-c34b-4a49-9248-e31d6a5d728c req-424ddac9-c8cc-4ae0-b6ed-58a51ff8f0bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:16:40 np0005554845 nova_compute[187128]: 2025-12-11 06:16:40.953 187132 DEBUG oslo_concurrency.lockutils [req-fc061a5e-c34b-4a49-9248-e31d6a5d728c req-424ddac9-c8cc-4ae0-b6ed-58a51ff8f0bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "e8fcccda-d2bc-4e5a-b478-e526a1d2662c-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:16:40 np0005554845 nova_compute[187128]: 2025-12-11 06:16:40.953 187132 DEBUG nova.compute.manager [req-fc061a5e-c34b-4a49-9248-e31d6a5d728c req-424ddac9-c8cc-4ae0-b6ed-58a51ff8f0bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] No waiting events found dispatching network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:16:40 np0005554845 nova_compute[187128]: 2025-12-11 06:16:40.953 187132 WARNING nova.compute.manager [req-fc061a5e-c34b-4a49-9248-e31d6a5d728c req-424ddac9-c8cc-4ae0-b6ed-58a51ff8f0bc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Received unexpected event network-vif-plugged-50100b7e-dec5-41e3-a26a-a1b50420a48f for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:16:41 np0005554845 nova_compute[187128]: 2025-12-11 06:16:41.874 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:43 np0005554845 nova_compute[187128]: 2025-12-11 06:16:43.690 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:43 np0005554845 nova_compute[187128]: 2025-12-11 06:16:43.796 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:45 np0005554845 nova_compute[187128]: 2025-12-11 06:16:45.634 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:46 np0005554845 nova_compute[187128]: 2025-12-11 06:16:46.878 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:47 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:16:47.210 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '15'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:16:50 np0005554845 nova_compute[187128]: 2025-12-11 06:16:50.636 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:51 np0005554845 nova_compute[187128]: 2025-12-11 06:16:51.852 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433796.8520186, e8fcccda-d2bc-4e5a-b478-e526a1d2662c => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:16:51 np0005554845 nova_compute[187128]: 2025-12-11 06:16:51.853 187132 INFO nova.compute.manager [-] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:16:51 np0005554845 nova_compute[187128]: 2025-12-11 06:16:51.881 187132 DEBUG nova.compute.manager [None req-82c517b0-8f7a-4aef-b375-7a80e364f1b8 - - - - - -] [instance: e8fcccda-d2bc-4e5a-b478-e526a1d2662c] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:16:51 np0005554845 nova_compute[187128]: 2025-12-11 06:16:51.882 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:53 np0005554845 podman[222789]: 2025-12-11 06:16:53.11431197 +0000 UTC m=+0.052239570 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:16:55 np0005554845 nova_compute[187128]: 2025-12-11 06:16:55.637 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:56 np0005554845 nova_compute[187128]: 2025-12-11 06:16:56.885 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:16:59 np0005554845 podman[222814]: 2025-12-11 06:16:59.150749254 +0000 UTC m=+0.077714872 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS)
Dec 11 01:17:00 np0005554845 nova_compute[187128]: 2025-12-11 06:17:00.640 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:01 np0005554845 podman[222834]: 2025-12-11 06:17:01.127518512 +0000 UTC m=+0.055184411 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 01:17:01 np0005554845 podman[222835]: 2025-12-11 06:17:01.157413673 +0000 UTC m=+0.083997452 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller)
Dec 11 01:17:01 np0005554845 nova_compute[187128]: 2025-12-11 06:17:01.887 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:04 np0005554845 podman[222878]: 2025-12-11 06:17:04.118282493 +0000 UTC m=+0.055856498 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, container_name=multipathd, org.label-schema.license=GPLv2)
Dec 11 01:17:05 np0005554845 nova_compute[187128]: 2025-12-11 06:17:05.642 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:06 np0005554845 nova_compute[187128]: 2025-12-11 06:17:06.889 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:09 np0005554845 podman[222898]: 2025-12-11 06:17:09.132444448 +0000 UTC m=+0.064506173 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:17:09 np0005554845 podman[222899]: 2025-12-11 06:17:09.133040893 +0000 UTC m=+0.064554624 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., build-date=2025-08-20T13:12:41, distribution-scope=public, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., container_name=openstack_network_exporter, managed_by=edpm_ansible, version=9.6, io.buildah.version=1.33.7)
Dec 11 01:17:10 np0005554845 nova_compute[187128]: 2025-12-11 06:17:10.700 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:11 np0005554845 nova_compute[187128]: 2025-12-11 06:17:11.893 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:12 np0005554845 nova_compute[187128]: 2025-12-11 06:17:12.987 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:12 np0005554845 nova_compute[187128]: 2025-12-11 06:17:12.987 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.008 187132 DEBUG nova.compute.manager [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.073 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.074 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.082 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.083 187132 INFO nova.compute.claims [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.190 187132 DEBUG nova.compute.provider_tree [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.210 187132 DEBUG nova.scheduler.client.report [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.231 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.157s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.232 187132 DEBUG nova.compute.manager [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.309 187132 DEBUG nova.compute.manager [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.310 187132 DEBUG nova.network.neutron [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.332 187132 INFO nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.355 187132 DEBUG nova.compute.manager [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.498 187132 DEBUG nova.compute.manager [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.499 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.499 187132 INFO nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Creating image(s)#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.500 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "/var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.500 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.500 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.513 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.533 187132 DEBUG nova.policy [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.572 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.573 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.574 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.589 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.682 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.093s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.683 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.724 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk 1073741824" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.725 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.152s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.726 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.804 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.078s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.806 187132 DEBUG nova.virt.disk.api [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Checking if we can resize image /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.806 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.890 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk --force-share --output=json" returned: 0 in 0.084s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.891 187132 DEBUG nova.virt.disk.api [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Cannot resize image /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.891 187132 DEBUG nova.objects.instance [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 45a86888-e6a3-42e0-a383-d78cdd0e25fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.903 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.904 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Ensure instance console log exists: /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.905 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.905 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:13 np0005554845 nova_compute[187128]: 2025-12-11 06:17:13.905 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:15 np0005554845 nova_compute[187128]: 2025-12-11 06:17:15.703 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:15 np0005554845 nova_compute[187128]: 2025-12-11 06:17:15.775 187132 DEBUG nova.network.neutron [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Successfully created port: 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:17:16 np0005554845 nova_compute[187128]: 2025-12-11 06:17:16.887 187132 DEBUG nova.network.neutron [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Successfully updated port: 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:17:16 np0005554845 nova_compute[187128]: 2025-12-11 06:17:16.896 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:16 np0005554845 nova_compute[187128]: 2025-12-11 06:17:16.900 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:17:16 np0005554845 nova_compute[187128]: 2025-12-11 06:17:16.901 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquired lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:17:16 np0005554845 nova_compute[187128]: 2025-12-11 06:17:16.901 187132 DEBUG nova.network.neutron [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:17:16 np0005554845 nova_compute[187128]: 2025-12-11 06:17:16.957 187132 DEBUG nova.compute.manager [req-3c083f25-c614-4e7d-a85a-2ae679e4d5f2 req-b01cf3d9-fc63-487a-9388-dba05a7612ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Received event network-changed-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:17:16 np0005554845 nova_compute[187128]: 2025-12-11 06:17:16.957 187132 DEBUG nova.compute.manager [req-3c083f25-c614-4e7d-a85a-2ae679e4d5f2 req-b01cf3d9-fc63-487a-9388-dba05a7612ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Refreshing instance network info cache due to event network-changed-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:17:16 np0005554845 nova_compute[187128]: 2025-12-11 06:17:16.958 187132 DEBUG oslo_concurrency.lockutils [req-3c083f25-c614-4e7d-a85a-2ae679e4d5f2 req-b01cf3d9-fc63-487a-9388-dba05a7612ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:17:17 np0005554845 nova_compute[187128]: 2025-12-11 06:17:17.020 187132 DEBUG nova.network.neutron [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.784 187132 DEBUG nova.network.neutron [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Updating instance_info_cache with network_info: [{"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.828 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Releasing lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.829 187132 DEBUG nova.compute.manager [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Instance network_info: |[{"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.830 187132 DEBUG oslo_concurrency.lockutils [req-3c083f25-c614-4e7d-a85a-2ae679e4d5f2 req-b01cf3d9-fc63-487a-9388-dba05a7612ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.831 187132 DEBUG nova.network.neutron [req-3c083f25-c614-4e7d-a85a-2ae679e4d5f2 req-b01cf3d9-fc63-487a-9388-dba05a7612ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Refreshing network info cache for port 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.836 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Start _get_guest_xml network_info=[{"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.843 187132 WARNING nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.848 187132 DEBUG nova.virt.libvirt.host [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.850 187132 DEBUG nova.virt.libvirt.host [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.855 187132 DEBUG nova.virt.libvirt.host [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.857 187132 DEBUG nova.virt.libvirt.host [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.858 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.859 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.860 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.860 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.861 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.861 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.862 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.862 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.863 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.863 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.863 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.864 187132 DEBUG nova.virt.hardware [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.871 187132 DEBUG nova.virt.libvirt.vif [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:17:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-538885827',display_name='tempest-TestGettingAddress-server-538885827',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-538885827',id=42,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGjbLyg1YoDhHRFVFNs2iD0sgE6iKgmDCYJJhk9ForuYdpOT3u1ErL+6vfB0W0+wJQ87rVKekA4NcoJilcp+eACrmScyyWa4ZYGGh/WHn+bQNFNeaoBE9WbIGywRdxMdug==',key_name='tempest-TestGettingAddress-178936991',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-d7equdrc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:17:13Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=45a86888-e6a3-42e0-a383-d78cdd0e25fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, 
"ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.872 187132 DEBUG nova.network.os_vif_util [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.873 187132 DEBUG nova.network.os_vif_util [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=0ce80b63-298a-4ec4-9d0a-8a7632ca6b57,network=Network(e8f46fd3-4213-49d6-9445-d5868c7b20f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ce80b63-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.875 187132 DEBUG nova.objects.instance [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 45a86888-e6a3-42e0-a383-d78cdd0e25fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.896 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  <uuid>45a86888-e6a3-42e0-a383-d78cdd0e25fd</uuid>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  <name>instance-0000002a</name>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestGettingAddress-server-538885827</nova:name>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:17:18</nova:creationTime>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:        <nova:user uuid="60e9372de4754580913a836e11b9c248">tempest-TestGettingAddress-725523770-project-member</nova:user>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:        <nova:project uuid="79a211a6fc3c4f68b6c3d0ba433964d3">tempest-TestGettingAddress-725523770</nova:project>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:        <nova:port uuid="0ce80b63-298a-4ec4-9d0a-8a7632ca6b57">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:feb9:7e24" ipVersion="6"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.10" ipVersion="4"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:feb9:7e24" ipVersion="6"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <entry name="serial">45a86888-e6a3-42e0-a383-d78cdd0e25fd</entry>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <entry name="uuid">45a86888-e6a3-42e0-a383-d78cdd0e25fd</entry>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.config"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:b9:7e:24"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <target dev="tap0ce80b63-29"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/console.log" append="off"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:17:18 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:17:18 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:17:18 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:17:18 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.898 187132 DEBUG nova.compute.manager [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Preparing to wait for external event network-vif-plugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.899 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.900 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.900 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.901 187132 DEBUG nova.virt.libvirt.vif [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:17:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-538885827',display_name='tempest-TestGettingAddress-server-538885827',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-538885827',id=42,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGjbLyg1YoDhHRFVFNs2iD0sgE6iKgmDCYJJhk9ForuYdpOT3u1ErL+6vfB0W0+wJQ87rVKekA4NcoJilcp+eACrmScyyWa4ZYGGh/WHn+bQNFNeaoBE9WbIGywRdxMdug==',key_name='tempest-TestGettingAddress-178936991',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-d7equdrc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:17:13Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=45a86888-e6a3-42e0-a383-d78cdd0e25fd,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": 
true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.902 187132 DEBUG nova.network.os_vif_util [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.903 187132 DEBUG nova.network.os_vif_util [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:b9:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=0ce80b63-298a-4ec4-9d0a-8a7632ca6b57,network=Network(e8f46fd3-4213-49d6-9445-d5868c7b20f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ce80b63-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.904 187132 DEBUG os_vif [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=0ce80b63-298a-4ec4-9d0a-8a7632ca6b57,network=Network(e8f46fd3-4213-49d6-9445-d5868c7b20f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ce80b63-29') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.905 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.905 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.906 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.910 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.910 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0ce80b63-29, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.911 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap0ce80b63-29, col_values=(('external_ids', {'iface-id': '0ce80b63-298a-4ec4-9d0a-8a7632ca6b57', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:b9:7e:24', 'vm-uuid': '45a86888-e6a3-42e0-a383-d78cdd0e25fd'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.914 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:18 np0005554845 NetworkManager[55529]: <info>  [1765433838.9157] manager: (tap0ce80b63-29): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/136)
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.919 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.921 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.923 187132 INFO os_vif [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:b9:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=0ce80b63-298a-4ec4-9d0a-8a7632ca6b57,network=Network(e8f46fd3-4213-49d6-9445-d5868c7b20f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ce80b63-29')#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.985 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.987 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.987 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No VIF found with MAC fa:16:3e:b9:7e:24, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:17:18 np0005554845 nova_compute[187128]: 2025-12-11 06:17:18.987 187132 INFO nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Using config drive#033[00m
Dec 11 01:17:19 np0005554845 nova_compute[187128]: 2025-12-11 06:17:19.613 187132 INFO nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Creating config drive at /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.config#033[00m
Dec 11 01:17:19 np0005554845 nova_compute[187128]: 2025-12-11 06:17:19.618 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvpv7yste execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:19 np0005554845 nova_compute[187128]: 2025-12-11 06:17:19.748 187132 DEBUG oslo_concurrency.processutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpvpv7yste" returned: 0 in 0.129s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:19 np0005554845 kernel: tap0ce80b63-29: entered promiscuous mode
Dec 11 01:17:19 np0005554845 NetworkManager[55529]: <info>  [1765433839.8142] manager: (tap0ce80b63-29): new Tun device (/org/freedesktop/NetworkManager/Devices/137)
Dec 11 01:17:19 np0005554845 nova_compute[187128]: 2025-12-11 06:17:19.814 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:19Z|00271|binding|INFO|Claiming lport 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 for this chassis.
Dec 11 01:17:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:19Z|00272|binding|INFO|0ce80b63-298a-4ec4-9d0a-8a7632ca6b57: Claiming fa:16:3e:b9:7e:24 10.100.0.10 2001:db8:0:1:f816:3eff:feb9:7e24 2001:db8::f816:3eff:feb9:7e24
Dec 11 01:17:19 np0005554845 nova_compute[187128]: 2025-12-11 06:17:19.818 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:19 np0005554845 nova_compute[187128]: 2025-12-11 06:17:19.823 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:19 np0005554845 nova_compute[187128]: 2025-12-11 06:17:19.826 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:19 np0005554845 nova_compute[187128]: 2025-12-11 06:17:19.831 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:19 np0005554845 NetworkManager[55529]: <info>  [1765433839.8336] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/138)
Dec 11 01:17:19 np0005554845 NetworkManager[55529]: <info>  [1765433839.8348] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/139)
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.836 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:7e:24 10.100.0.10 2001:db8:0:1:f816:3eff:feb9:7e24 2001:db8::f816:3eff:feb9:7e24'], port_security=['fa:16:3e:b9:7e:24 10.100.0.10 2001:db8:0:1:f816:3eff:feb9:7e24 2001:db8::f816:3eff:feb9:7e24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28 2001:db8:0:1:f816:3eff:feb9:7e24/64 2001:db8::f816:3eff:feb9:7e24/64', 'neutron:device_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2a302d93-d825-40ba-a363-74d1dc48857e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=078ed33d-3a39-4095-bb26-184c0c14abff, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=0ce80b63-298a-4ec4-9d0a-8a7632ca6b57) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.837 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 in datapath e8f46fd3-4213-49d6-9445-d5868c7b20f6 bound to our chassis#033[00m
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.839 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network e8f46fd3-4213-49d6-9445-d5868c7b20f6#033[00m
Dec 11 01:17:19 np0005554845 systemd-machined[153381]: New machine qemu-20-instance-0000002a.
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.854 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[dd5a7a07-f80e-4ec9-a4ec-136b40837add]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.855 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tape8f46fd3-41 in ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:17:19 np0005554845 systemd-udevd[222977]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.857 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tape8f46fd3-40 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.857 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3487eea3-e1c2-4f0c-b57d-f71ea95651cb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.858 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2837611b-b37b-40ae-941c-73d263810222]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:19 np0005554845 NetworkManager[55529]: <info>  [1765433839.8687] device (tap0ce80b63-29): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:17:19 np0005554845 NetworkManager[55529]: <info>  [1765433839.8695] device (tap0ce80b63-29): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.869 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[1243590b-bcee-476c-a43a-b3390655e6a1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:19 np0005554845 systemd[1]: Started Virtual Machine qemu-20-instance-0000002a.
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.899 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4deb518b-69f2-4abc-84cc-54269cee45d9]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.928 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[674ca732-d919-45f4-b9c5-a99f7fa44c84]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:19 np0005554845 NetworkManager[55529]: <info>  [1765433839.9360] manager: (tape8f46fd3-40): new Veth device (/org/freedesktop/NetworkManager/Devices/140)
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.935 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c6a8f514-abcb-4546-b360-d87bc73ff40d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:19 np0005554845 systemd-udevd[222980]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:17:19 np0005554845 nova_compute[187128]: 2025-12-11 06:17:19.945 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:19 np0005554845 nova_compute[187128]: 2025-12-11 06:17:19.961 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:19Z|00273|binding|INFO|Setting lport 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 ovn-installed in OVS
Dec 11 01:17:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:19Z|00274|binding|INFO|Setting lport 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 up in Southbound
Dec 11 01:17:19 np0005554845 nova_compute[187128]: 2025-12-11 06:17:19.966 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.973 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[5786c180-6138-4948-8bef-2c3e2980e549]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:19 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:19.977 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[d0f8a5f1-f7ab-47c9-a4c3-403ae73b655f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:20 np0005554845 NetworkManager[55529]: <info>  [1765433840.0011] device (tape8f46fd3-40): carrier: link connected
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.009 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[81082caf-5df5-4b0b-b855-be7b47a5a74d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.034 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a210c5d2-842d-4899-8f58-c6d295c60306]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8f46fd3-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:87:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411464, 'reachable_time': 38863, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223009, 'error': None, 'target': 'ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.057 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1f6ecd50-7d98-4bdf-a653-d566b19dc6ea]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe3f:8794'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 411464, 'tstamp': 411464}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223010, 'error': None, 'target': 'ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.081 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[140de96f-a1e3-4880-82d1-05f514e69803]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tape8f46fd3-41'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:3f:87:94'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 82], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411464, 'reachable_time': 38863, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223012, 'error': None, 'target': 'ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.120 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[50477b4f-72b1-4f43-9b84-3b738a9688fc]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.168 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433840.167597, 45a86888-e6a3-42e0-a383-d78cdd0e25fd => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.169 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] VM Started (Lifecycle Event)#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.180 187132 DEBUG nova.compute.manager [req-07f7e756-81ea-446c-a04a-374e8c369327 req-45effd4c-aeec-4324-833b-8b3c37db8467 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Received event network-vif-plugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.182 187132 DEBUG oslo_concurrency.lockutils [req-07f7e756-81ea-446c-a04a-374e8c369327 req-45effd4c-aeec-4324-833b-8b3c37db8467 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.183 187132 DEBUG oslo_concurrency.lockutils [req-07f7e756-81ea-446c-a04a-374e8c369327 req-45effd4c-aeec-4324-833b-8b3c37db8467 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.183 187132 DEBUG oslo_concurrency.lockutils [req-07f7e756-81ea-446c-a04a-374e8c369327 req-45effd4c-aeec-4324-833b-8b3c37db8467 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.184 187132 DEBUG nova.compute.manager [req-07f7e756-81ea-446c-a04a-374e8c369327 req-45effd4c-aeec-4324-833b-8b3c37db8467 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Processing event network-vif-plugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.186 187132 DEBUG nova.compute.manager [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.186 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[40738e6e-1d34-4f2d-be60-6021751507a6]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.188 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8f46fd3-40, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.189 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.190 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape8f46fd3-40, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.191 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.192 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:20 np0005554845 kernel: tape8f46fd3-40: entered promiscuous mode
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.194 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:20 np0005554845 NetworkManager[55529]: <info>  [1765433840.1963] manager: (tape8f46fd3-40): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/141)
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.197 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tape8f46fd3-40, col_values=(('external_ids', {'iface-id': '2a972d4f-67a7-4d0d-9e44-3eec77085e79'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.197 187132 INFO nova.virt.libvirt.driver [-] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Instance spawned successfully.#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.198 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:17:20 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:20Z|00275|binding|INFO|Releasing lport 2a972d4f-67a7-4d0d-9e44-3eec77085e79 from this chassis (sb_readonly=0)
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.199 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.200 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/e8f46fd3-4213-49d6-9445-d5868c7b20f6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/e8f46fd3-4213-49d6-9445-d5868c7b20f6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.201 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.201 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[9fc4e7ac-8d08-49be-99b6-d3983d6b3bbd]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.204 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-e8f46fd3-4213-49d6-9445-d5868c7b20f6
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/e8f46fd3-4213-49d6-9445-d5868c7b20f6.pid.haproxy
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID e8f46fd3-4213-49d6-9445-d5868c7b20f6
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.206 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:17:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:20.206 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'env', 'PROCESS_TAG=haproxy-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/e8f46fd3-4213-49d6-9445-d5868c7b20f6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.217 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.223 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.223 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.224 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.224 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.225 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.225 187132 DEBUG nova.virt.libvirt.driver [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.229 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.229 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433840.1739, 45a86888-e6a3-42e0-a383-d78cdd0e25fd => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.230 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.269 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.273 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433840.1905496, 45a86888-e6a3-42e0-a383-d78cdd0e25fd => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.273 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.292 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.295 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.304 187132 INFO nova.compute.manager [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Took 6.81 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.304 187132 DEBUG nova.compute.manager [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.329 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.362 187132 INFO nova.compute.manager [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Took 7.32 seconds to build instance.#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.375 187132 DEBUG oslo_concurrency.lockutils [None req-ad87a3b4-5282-4651-ae78-f9425a4fc5a2 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.387s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:20 np0005554845 podman[223050]: 2025-12-11 06:17:20.614955852 +0000 UTC m=+0.056288779 container create a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, io.buildah.version=1.41.3)
Dec 11 01:17:20 np0005554845 systemd[1]: Started libpod-conmon-a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7.scope.
Dec 11 01:17:20 np0005554845 podman[223050]: 2025-12-11 06:17:20.583892159 +0000 UTC m=+0.025225116 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:17:20 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:17:20 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/940fbd2bf39ebc1b31a03b08859b25be3519dad5fcd8aa60fec108af0d4ceb43/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:17:20 np0005554845 podman[223050]: 2025-12-11 06:17:20.703195778 +0000 UTC m=+0.144528735 container init a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.705 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:20 np0005554845 podman[223050]: 2025-12-11 06:17:20.711127013 +0000 UTC m=+0.152459940 container start a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:17:20 np0005554845 neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6[223066]: [NOTICE]   (223070) : New worker (223072) forked
Dec 11 01:17:20 np0005554845 neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6[223066]: [NOTICE]   (223070) : Loading success.
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.914 187132 DEBUG nova.network.neutron [req-3c083f25-c614-4e7d-a85a-2ae679e4d5f2 req-b01cf3d9-fc63-487a-9388-dba05a7612ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Updated VIF entry in instance network info cache for port 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.915 187132 DEBUG nova.network.neutron [req-3c083f25-c614-4e7d-a85a-2ae679e4d5f2 req-b01cf3d9-fc63-487a-9388-dba05a7612ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Updating instance_info_cache with network_info: [{"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:17:20 np0005554845 nova_compute[187128]: 2025-12-11 06:17:20.933 187132 DEBUG oslo_concurrency.lockutils [req-3c083f25-c614-4e7d-a85a-2ae679e4d5f2 req-b01cf3d9-fc63-487a-9388-dba05a7612ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:17:22 np0005554845 nova_compute[187128]: 2025-12-11 06:17:22.347 187132 DEBUG nova.compute.manager [req-324facb9-2194-43bc-abdf-7da0fb6a9b97 req-20c0810a-b2d0-421b-aedd-29e34d471e83 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Received event network-vif-plugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:17:22 np0005554845 nova_compute[187128]: 2025-12-11 06:17:22.347 187132 DEBUG oslo_concurrency.lockutils [req-324facb9-2194-43bc-abdf-7da0fb6a9b97 req-20c0810a-b2d0-421b-aedd-29e34d471e83 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:22 np0005554845 nova_compute[187128]: 2025-12-11 06:17:22.347 187132 DEBUG oslo_concurrency.lockutils [req-324facb9-2194-43bc-abdf-7da0fb6a9b97 req-20c0810a-b2d0-421b-aedd-29e34d471e83 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:22 np0005554845 nova_compute[187128]: 2025-12-11 06:17:22.348 187132 DEBUG oslo_concurrency.lockutils [req-324facb9-2194-43bc-abdf-7da0fb6a9b97 req-20c0810a-b2d0-421b-aedd-29e34d471e83 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:22 np0005554845 nova_compute[187128]: 2025-12-11 06:17:22.348 187132 DEBUG nova.compute.manager [req-324facb9-2194-43bc-abdf-7da0fb6a9b97 req-20c0810a-b2d0-421b-aedd-29e34d471e83 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] No waiting events found dispatching network-vif-plugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:17:22 np0005554845 nova_compute[187128]: 2025-12-11 06:17:22.348 187132 WARNING nova.compute.manager [req-324facb9-2194-43bc-abdf-7da0fb6a9b97 req-20c0810a-b2d0-421b-aedd-29e34d471e83 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Received unexpected event network-vif-plugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 for instance with vm_state active and task_state None.#033[00m
Dec 11 01:17:23 np0005554845 nova_compute[187128]: 2025-12-11 06:17:23.916 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:24 np0005554845 podman[223081]: 2025-12-11 06:17:24.129129766 +0000 UTC m=+0.054064870 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:17:24 np0005554845 nova_compute[187128]: 2025-12-11 06:17:24.420 187132 DEBUG nova.compute.manager [req-ba86a9f1-8a9a-4eca-a94e-1e757fd41a08 req-179a6f17-dc5a-4344-8da7-37fbcfa1d88f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Received event network-changed-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:17:24 np0005554845 nova_compute[187128]: 2025-12-11 06:17:24.421 187132 DEBUG nova.compute.manager [req-ba86a9f1-8a9a-4eca-a94e-1e757fd41a08 req-179a6f17-dc5a-4344-8da7-37fbcfa1d88f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Refreshing instance network info cache due to event network-changed-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:17:24 np0005554845 nova_compute[187128]: 2025-12-11 06:17:24.421 187132 DEBUG oslo_concurrency.lockutils [req-ba86a9f1-8a9a-4eca-a94e-1e757fd41a08 req-179a6f17-dc5a-4344-8da7-37fbcfa1d88f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:17:24 np0005554845 nova_compute[187128]: 2025-12-11 06:17:24.421 187132 DEBUG oslo_concurrency.lockutils [req-ba86a9f1-8a9a-4eca-a94e-1e757fd41a08 req-179a6f17-dc5a-4344-8da7-37fbcfa1d88f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:17:24 np0005554845 nova_compute[187128]: 2025-12-11 06:17:24.422 187132 DEBUG nova.network.neutron [req-ba86a9f1-8a9a-4eca-a94e-1e757fd41a08 req-179a6f17-dc5a-4344-8da7-37fbcfa1d88f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Refreshing network info cache for port 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:17:25 np0005554845 nova_compute[187128]: 2025-12-11 06:17:25.727 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:26.230 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:26.231 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:26.232 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:27 np0005554845 nova_compute[187128]: 2025-12-11 06:17:27.136 187132 DEBUG nova.network.neutron [req-ba86a9f1-8a9a-4eca-a94e-1e757fd41a08 req-179a6f17-dc5a-4344-8da7-37fbcfa1d88f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Updated VIF entry in instance network info cache for port 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:17:27 np0005554845 nova_compute[187128]: 2025-12-11 06:17:27.136 187132 DEBUG nova.network.neutron [req-ba86a9f1-8a9a-4eca-a94e-1e757fd41a08 req-179a6f17-dc5a-4344-8da7-37fbcfa1d88f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Updating instance_info_cache with network_info: [{"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:17:27 np0005554845 nova_compute[187128]: 2025-12-11 06:17:27.474 187132 DEBUG oslo_concurrency.lockutils [req-ba86a9f1-8a9a-4eca-a94e-1e757fd41a08 req-179a6f17-dc5a-4344-8da7-37fbcfa1d88f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:17:28 np0005554845 nova_compute[187128]: 2025-12-11 06:17:28.923 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:29 np0005554845 nova_compute[187128]: 2025-12-11 06:17:29.395 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.103 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'name': 'tempest-TestGettingAddress-server-538885827', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002a', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'user_id': '60e9372de4754580913a836e11b9c248', 'hostId': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.105 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.119 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.120 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e858ab9f-d006-4aaf-99e3-0d2aa95481dc', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-vda', 'timestamp': '2025-12-11T06:17:30.105904', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '129d01a0-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.807657353, 'message_signature': 'eb4799692aec097b525137edb93d8f7c88bb8e301bd18c0287a26ab49063c414'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-sda', 'timestamp': '2025-12-11T06:17:30.105904', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '129d0efc-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.807657353, 'message_signature': 'fdf19c4c1a015a84e6e02ba8741acac857cf0ecc494066d7ad1552520c34dac5'}]}, 'timestamp': '2025-12-11 06:17:30.120294', '_unique_id': '56fb14fcea88478798fcf4b4d1a0f643'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.122 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.123 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.129 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 45a86888-e6a3-42e0-a383-d78cdd0e25fd / tap0ce80b63-29 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.130 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9c729900-cdc1-4c01-a39b-395c2f49a68b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000002a-45a86888-e6a3-42e0-a383-d78cdd0e25fd-tap0ce80b63-29', 'timestamp': '2025-12-11T06:17:30.123872', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'tap0ce80b63-29', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ce80b63-29'}, 'message_id': '129ea866-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.825604211, 'message_signature': '467e0625e28a2ce546ac52653d2974acc530e58f1c88d5e393512438f6ed3e10'}]}, 'timestamp': '2025-12-11 06:17:30.130934', '_unique_id': '39f992de7a184aa09bc298ba30dfa013'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.132 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.133 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 01:17:30 np0005554845 podman[223108]: 2025-12-11 06:17:30.156217594 +0000 UTC m=+0.075753748 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_id=edpm, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.165 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.166 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '59a627b0-e9a7-4003-aabc-861137a9a131', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-vda', 'timestamp': '2025-12-11T06:17:30.133531', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '12a405f4-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': 'adc216a2d4c61fd57aab02e20024506e7c37f808af12f63609746f45e7c3c459'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-sda', 'timestamp': '2025-12-11T06:17:30.133531', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '12a41364-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': 'b73bdb86233aadeb3be997694780db4d2be28627720ba7a217677f5bd241581f'}]}, 'timestamp': '2025-12-11 06:17:30.166278', '_unique_id': 'b504997cde5849f88228338d58d56f57'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.167 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.168 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.168 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.168 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-538885827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-538885827>]
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.168 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.184 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/cpu volume: 9650000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6b22ea1d-b788-4614-8067-3d753a8e9422', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 9650000000, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'timestamp': '2025-12-11T06:17:30.169048', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '12a6e0b2-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.885867077, 'message_signature': '1a1a22ccbb383626a8558fd942d22b9d1e58ba212c7265cd8d61142b9206e8aa'}]}, 'timestamp': '2025-12-11 06:17:30.184696', '_unique_id': 'aaeb296900a846c0b668d01d084c24ec'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.185 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.186 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.186 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.read.latency volume: 121836526 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.186 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.read.latency volume: 525474 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '856ecfb6-6d08-48a7-a300-13640db10f0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 121836526, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-vda', 'timestamp': '2025-12-11T06:17:30.186525', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '12a73436-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': '3fa45cb5d822b88fe70b4c63f8ee236d24a7d480a99adca179d9bed705da32d0'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 525474, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 
'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-sda', 'timestamp': '2025-12-11T06:17:30.186525', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '12a73c88-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': 'b6912880b3fba09bd95296fdaa1abab9d8e18782708725b3ede279dbbb75cefe'}]}, 'timestamp': '2025-12-11 06:17:30.186964', '_unique_id': '4e58d693f39845809ebb7fe5206dce59'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.187 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.188 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.188 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.188 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-538885827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-538885827>]
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.188 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.188 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.188 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4ab1b17-60cf-4c12-b97b-777bc57d41a3', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-vda', 'timestamp': '2025-12-11T06:17:30.188461', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '12a77f86-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': '2a2be7f8a539c5addcc533cdd34df1e9cecc61b8acafd999f719ec4d8bfe4487'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-sda', 'timestamp': '2025-12-11T06:17:30.188461', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '12a787a6-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': '3427dbb892870afdc534fac454c95c3c9bf235e6011565248d3a2a745fc07a43'}]}, 'timestamp': '2025-12-11 06:17:30.188886', '_unique_id': 'd1bdc10591ca42aead9a35c914a766e2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e0a95698-cfe0-47bf-8919-5fb104022a8f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000002a-45a86888-e6a3-42e0-a383-d78cdd0e25fd-tap0ce80b63-29', 'timestamp': '2025-12-11T06:17:30.190168', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'tap0ce80b63-29', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ce80b63-29'}, 'message_id': '12a7c27a-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.825604211, 'message_signature': 'ab3981839191315ae3e91e5a93e33bbd4b41e53a4f3ab14be3d5c107a82001df'}]}, 'timestamp': '2025-12-11 06:17:30.190442', '_unique_id': '714760073cfe49b7a41e75240de6f02f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.190 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.191 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0c9116fd-ed06-4172-a6b7-3a8f00c70847', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000002a-45a86888-e6a3-42e0-a383-d78cdd0e25fd-tap0ce80b63-29', 'timestamp': '2025-12-11T06:17:30.191536', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'tap0ce80b63-29', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ce80b63-29'}, 'message_id': '12a7f876-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.825604211, 'message_signature': '9954029a3e9c0ab1bdcb641f1f3db230865c179265ddae4449032dc273a8997e'}]}, 'timestamp': '2025-12-11 06:17:30.191793', '_unique_id': 'f1b4e07f63374bb69acd4272c239645c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.192 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '94ec70a4-1ce3-4895-a620-ea4f4729ed6c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-vda', 'timestamp': '2025-12-11T06:17:30.192845', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '12a82ab2-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.807657353, 'message_signature': 'e40061a9ab0a59bb8a09fab8bf865c95b4dcf7b4e0b89ad8c1d87ab0c633447c'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-sda', 'timestamp': '2025-12-11T06:17:30.192845', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '12a83430-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.807657353, 'message_signature': 'fb2687834e621934e61266fb4f770f2ffb314b357a8f7a877c5722540b8b8f80'}]}, 'timestamp': '2025-12-11 06:17:30.193304', '_unique_id': 'deb54b00a5504ea5bfd533356e60ec9c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.193 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.194 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.194 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f43fd435-864c-4aba-ac1a-7609b7147c88', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000002a-45a86888-e6a3-42e0-a383-d78cdd0e25fd-tap0ce80b63-29', 'timestamp': '2025-12-11T06:17:30.194467', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'tap0ce80b63-29', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ce80b63-29'}, 'message_id': '12a86bd0-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.825604211, 'message_signature': 'f6be8f2871828122f4a089ade3b513657696dc7dd014c4136ae00de38d131aca'}]}, 'timestamp': '2025-12-11 06:17:30.194764', '_unique_id': '01dd2170c9fe488580421eba53764d60'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.195 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.196 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-538885827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-538885827>]
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.196 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '64914a44-13c6-4606-84c3-15c117ca5329', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-vda', 'timestamp': '2025-12-11T06:17:30.196279', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '12a8c59e-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': '1d6febd49ef8f0335adeb957a20a53fa90ede3d7581d7a334855960fe1beea3a'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 
'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-sda', 'timestamp': '2025-12-11T06:17:30.196279', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '12a8ce90-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': 'fccb386ca73ac79e65c5f98983fe065b91a90a0d6c19f4643a1debaabdc8313a'}]}, 'timestamp': '2025-12-11 06:17:30.197268', '_unique_id': 'e4f7fc271fd2402eaec8563f71baf08b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.197 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.198 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.198 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '285dd090-2f9c-4282-a518-1fc9ffd829eb', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000002a-45a86888-e6a3-42e0-a383-d78cdd0e25fd-tap0ce80b63-29', 'timestamp': '2025-12-11T06:17:30.198360', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'tap0ce80b63-29', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ce80b63-29'}, 'message_id': '12a902d4-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.825604211, 'message_signature': '7f2e9e068e7df9146ff1d2c6229a4f5e039d0ddf5509959c82706792f3bbac03'}]}, 'timestamp': '2025-12-11 06:17:30.198605', '_unique_id': 'aa90069548ac49d8b5984eb707f796f0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.199 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a0be9b43-6148-402e-82c5-3110c8a69472', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-vda', 'timestamp': '2025-12-11T06:17:30.199686', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '12a93600-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': '2245ce9a81256c02c6789d122c63393857436fcfe4524322322ff800b03d5ea3'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 
'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-sda', 'timestamp': '2025-12-11T06:17:30.199686', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '12a93d9e-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': 'e2b9142dfc90011b14826c20dbc6366a093570724412a6afb54a2f53e6b2db8e'}]}, 'timestamp': '2025-12-11 06:17:30.200095', '_unique_id': 'ffd81eb69ea44c25be1e8a3c2f2fd19a'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.200 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.201 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.201 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.201 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'f9d52956-4272-467f-98b4-da7ec5925d53', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-vda', 'timestamp': '2025-12-11T06:17:30.201160', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '12a96f3a-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.807657353, 'message_signature': 'a2f4c2b418abbbb8dafd5c95316407e9492f4c9fb8c73508aba0f34f1f51b308'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 
'45a86888-e6a3-42e0-a383-d78cdd0e25fd-sda', 'timestamp': '2025-12-11T06:17:30.201160', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '12a9787c-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.807657353, 'message_signature': 'd88322a3bd94bca40ede823e553903e9f1babda24cfe0f727a881791d6d0ed09'}]}, 'timestamp': '2025-12-11 06:17:30.201604', '_unique_id': '6d8f6139fd5c423c8ff366322efb8b81'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.202 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c9fd3a7-cd1a-4e60-a239-7a9b5ba7f262', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000002a-45a86888-e6a3-42e0-a383-d78cdd0e25fd-tap0ce80b63-29', 'timestamp': '2025-12-11T06:17:30.202700', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'tap0ce80b63-29', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ce80b63-29'}, 'message_id': '12a9abee-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.825604211, 'message_signature': '4619ebdd043b349f0ddd3f54fc42b25bf64cc93b7065f0b2a203758b9ab654ef'}]}, 'timestamp': '2025-12-11 06:17:30.202936', '_unique_id': '413edf11236c4b73b5acc69e20b2b2d2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.203 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c97fc670-b756-408a-9785-6551db9a774c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-vda', 'timestamp': '2025-12-11T06:17:30.204043', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '12a9e0be-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': 'f8b4437619f3b2ebbccd08ca0c737ed80e91a97b3aa579a76249ec3b53824b44'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd-sda', 'timestamp': '2025-12-11T06:17:30.204043', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'instance-0000002a', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '12a9e974-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.835257933, 'message_signature': '68c2dff5d6898b23be9e79c73c1129422ed843f70259ef065b4071e78e669bb1'}]}, 'timestamp': '2025-12-11 06:17:30.204497', '_unique_id': '775fa30e168b4301b9d63543f9345512'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.204 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.205 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.205 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-538885827>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-538885827>]
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b91b2804-6411-4592-9355-20f645875915', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000002a-45a86888-e6a3-42e0-a383-d78cdd0e25fd-tap0ce80b63-29', 'timestamp': '2025-12-11T06:17:30.206109', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'tap0ce80b63-29', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ce80b63-29'}, 'message_id': '12aa323a-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.825604211, 'message_signature': '100713d4a64c91b927d47bdcbc9aedffe53e1fc06ae324d4e46f066bb3e83b57'}]}, 'timestamp': '2025-12-11 06:17:30.206390', '_unique_id': '38db5951c31947dabb90dcc4d3723357'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.206 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.207 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.207 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9b07bbc7-f5a7-440e-8bd3-52dd08f74c93', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000002a-45a86888-e6a3-42e0-a383-d78cdd0e25fd-tap0ce80b63-29', 'timestamp': '2025-12-11T06:17:30.207674', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'tap0ce80b63-29', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ce80b63-29'}, 'message_id': '12aa6de0-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.825604211, 'message_signature': '5f2872f2339659fb04d0289ae99cc16a2205826daf235a1f9697d346fb628a40'}]}, 'timestamp': '2025-12-11 06:17:30.207899', '_unique_id': 'd0e49172f5484c82a050bf4ff9b22308'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.208 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic is not available for instance 45a86888-e6a3-42e0-a383-d78cdd0e25fd: ceilometer.compute.pollsters.NoVolumeException
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e4bd91ac-d464-4970-aeba-69b8b7db44da', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000002a-45a86888-e6a3-42e0-a383-d78cdd0e25fd-tap0ce80b63-29', 'timestamp': '2025-12-11T06:17:30.209197', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'tap0ce80b63-29', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ce80b63-29'}, 'message_id': '12aaa92c-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.825604211, 'message_signature': '5d06c23690a465df5aa65df5798d3c382c2956d46195e55317209fb8593a9111'}]}, 'timestamp': '2025-12-11 06:17:30.209459', '_unique_id': 'c7fc46811f9b48abafee677b592d4916'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.209 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.210 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.210 12 DEBUG ceilometer.compute.pollsters [-] 45a86888-e6a3-42e0-a383-d78cdd0e25fd/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6a7c41b1-ea1b-4c31-9c78-9acaee0af09b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-0000002a-45a86888-e6a3-42e0-a383-d78cdd0e25fd-tap0ce80b63-29', 'timestamp': '2025-12-11T06:17:30.210609', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-538885827', 'name': 'tap0ce80b63-29', 'instance_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:b9:7e:24', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap0ce80b63-29'}, 'message_id': '12aae090-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4124.825604211, 'message_signature': 'c67f17900ab27abd6edcef8208f989fd37dc3decd7746cff2675d37ec8ffc9cc'}]}, 'timestamp': '2025-12-11 06:17:30.210839', '_unique_id': '0166732c4605484ea18e1c2d025dade2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:17:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:17:30.211 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:17:30 np0005554845 nova_compute[187128]: 2025-12-11 06:17:30.730 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:31 np0005554845 nova_compute[187128]: 2025-12-11 06:17:31.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:17:32 np0005554845 podman[223142]: 2025-12-11 06:17:32.132523139 +0000 UTC m=+0.058168071 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:17:32 np0005554845 podman[223143]: 2025-12-11 06:17:32.159618334 +0000 UTC m=+0.085615596 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller)
Dec 11 01:17:32 np0005554845 nova_compute[187128]: 2025-12-11 06:17:32.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:17:32 np0005554845 nova_compute[187128]: 2025-12-11 06:17:32.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:17:32 np0005554845 nova_compute[187128]: 2025-12-11 06:17:32.718 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:32 np0005554845 nova_compute[187128]: 2025-12-11 06:17:32.718 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:32 np0005554845 nova_compute[187128]: 2025-12-11 06:17:32.718 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:32 np0005554845 nova_compute[187128]: 2025-12-11 06:17:32.719 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:17:32 np0005554845 nova_compute[187128]: 2025-12-11 06:17:32.808 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:32 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:32Z|00036|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:b9:7e:24 10.100.0.10
Dec 11 01:17:32 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:32Z|00037|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:b9:7e:24 10.100.0.10
Dec 11 01:17:32 np0005554845 nova_compute[187128]: 2025-12-11 06:17:32.908 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk --force-share --output=json" returned: 0 in 0.100s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:32 np0005554845 nova_compute[187128]: 2025-12-11 06:17:32.909 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:32 np0005554845 nova_compute[187128]: 2025-12-11 06:17:32.962 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.040 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.139 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.141 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5453MB free_disk=73.26394653320312GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.141 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.142 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.220 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance 45a86888-e6a3-42e0-a383-d78cdd0e25fd actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.220 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.220 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.266 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.279 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.312 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.313 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:33 np0005554845 nova_compute[187128]: 2025-12-11 06:17:33.926 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:34 np0005554845 nova_compute[187128]: 2025-12-11 06:17:34.308 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:17:34 np0005554845 nova_compute[187128]: 2025-12-11 06:17:34.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:17:34 np0005554845 nova_compute[187128]: 2025-12-11 06:17:34.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:17:34 np0005554845 nova_compute[187128]: 2025-12-11 06:17:34.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:17:35 np0005554845 podman[223194]: 2025-12-11 06:17:35.144602708 +0000 UTC m=+0.075756408 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3)
Dec 11 01:17:35 np0005554845 nova_compute[187128]: 2025-12-11 06:17:35.383 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:17:35 np0005554845 nova_compute[187128]: 2025-12-11 06:17:35.384 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:17:35 np0005554845 nova_compute[187128]: 2025-12-11 06:17:35.384 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 01:17:35 np0005554845 nova_compute[187128]: 2025-12-11 06:17:35.384 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid 45a86888-e6a3-42e0-a383-d78cdd0e25fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:17:35 np0005554845 nova_compute[187128]: 2025-12-11 06:17:35.730 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:38 np0005554845 nova_compute[187128]: 2025-12-11 06:17:38.929 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:40 np0005554845 podman[223215]: 2025-12-11 06:17:40.138139821 +0000 UTC m=+0.064861222 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, distribution-scope=public, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, config_id=edpm, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, build-date=2025-08-20T13:12:41, version=9.6, io.buildah.version=1.33.7, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, vendor=Red Hat, Inc.)
Dec 11 01:17:40 np0005554845 podman[223214]: 2025-12-11 06:17:40.146760056 +0000 UTC m=+0.068956204 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:17:40 np0005554845 nova_compute[187128]: 2025-12-11 06:17:40.775 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:41 np0005554845 nova_compute[187128]: 2025-12-11 06:17:41.945 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Updating instance_info_cache with network_info: [{"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:17:41 np0005554845 nova_compute[187128]: 2025-12-11 06:17:41.963 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:17:41 np0005554845 nova_compute[187128]: 2025-12-11 06:17:41.964 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:17:41 np0005554845 nova_compute[187128]: 2025-12-11 06:17:41.964 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:17:41 np0005554845 nova_compute[187128]: 2025-12-11 06:17:41.965 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:17:41 np0005554845 nova_compute[187128]: 2025-12-11 06:17:41.965 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:17:41 np0005554845 nova_compute[187128]: 2025-12-11 06:17:41.965 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:17:41 np0005554845 nova_compute[187128]: 2025-12-11 06:17:41.965 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.017 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Acquiring lock "580cacf3-199a-4e28-a146-69ccaa92a8b9" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.018 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.035 187132 DEBUG nova.compute.manager [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.118 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.119 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.125 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.125 187132 INFO nova.compute.claims [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.244 187132 DEBUG nova.compute.provider_tree [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.261 187132 DEBUG nova.scheduler.client.report [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.290 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.290 187132 DEBUG nova.compute.manager [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.332 187132 DEBUG nova.compute.manager [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.333 187132 DEBUG nova.network.neutron [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.381 187132 INFO nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.401 187132 DEBUG nova.compute.manager [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.494 187132 DEBUG nova.compute.manager [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.496 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.497 187132 INFO nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Creating image(s)#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.498 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Acquiring lock "/var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.498 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "/var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.499 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "/var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.511 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.570 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.571 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.571 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.593 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.656 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.657 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.675 187132 DEBUG nova.policy [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '9a0a63046d2f4ebd819e7c5bb47d172b', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '1528e0ec9d214424bcb218bb466f693e', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.694 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk 1073741824" returned: 0 in 0.037s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.694 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.700 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.764 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.766 187132 DEBUG nova.virt.disk.api [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Checking if we can resize image /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.767 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.827 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.828 187132 DEBUG nova.virt.disk.api [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Cannot resize image /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.829 187132 DEBUG nova.objects.instance [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lazy-loading 'migration_context' on Instance uuid 580cacf3-199a-4e28-a146-69ccaa92a8b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.842 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.843 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Ensure instance console log exists: /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.843 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.844 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:42 np0005554845 nova_compute[187128]: 2025-12-11 06:17:42.844 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:43.260 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=16, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=15) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:17:43 np0005554845 nova_compute[187128]: 2025-12-11 06:17:43.260 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:43.262 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 4 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:17:43 np0005554845 nova_compute[187128]: 2025-12-11 06:17:43.437 187132 DEBUG nova.network.neutron [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Successfully created port: 476816b4-a4da-4bc1-b013-60c522330ded _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:17:43 np0005554845 nova_compute[187128]: 2025-12-11 06:17:43.932 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:44 np0005554845 nova_compute[187128]: 2025-12-11 06:17:44.801 187132 DEBUG nova.network.neutron [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Successfully updated port: 476816b4-a4da-4bc1-b013-60c522330ded _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:17:44 np0005554845 nova_compute[187128]: 2025-12-11 06:17:44.819 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Acquiring lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:17:44 np0005554845 nova_compute[187128]: 2025-12-11 06:17:44.820 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Acquired lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:17:44 np0005554845 nova_compute[187128]: 2025-12-11 06:17:44.820 187132 DEBUG nova.network.neutron [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:17:44 np0005554845 nova_compute[187128]: 2025-12-11 06:17:44.900 187132 DEBUG nova.compute.manager [req-1856f6b9-5c7e-42da-9608-02c931300e8f req-ad8de604-fad1-4b7a-901e-b85afd937dcc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Received event network-changed-476816b4-a4da-4bc1-b013-60c522330ded external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:17:44 np0005554845 nova_compute[187128]: 2025-12-11 06:17:44.900 187132 DEBUG nova.compute.manager [req-1856f6b9-5c7e-42da-9608-02c931300e8f req-ad8de604-fad1-4b7a-901e-b85afd937dcc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Refreshing instance network info cache due to event network-changed-476816b4-a4da-4bc1-b013-60c522330ded. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:17:44 np0005554845 nova_compute[187128]: 2025-12-11 06:17:44.900 187132 DEBUG oslo_concurrency.lockutils [req-1856f6b9-5c7e-42da-9608-02c931300e8f req-ad8de604-fad1-4b7a-901e-b85afd937dcc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:17:44 np0005554845 nova_compute[187128]: 2025-12-11 06:17:44.967 187132 DEBUG nova.network.neutron [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.651 187132 DEBUG nova.network.neutron [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Updating instance_info_cache with network_info: [{"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.672 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Releasing lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.673 187132 DEBUG nova.compute.manager [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Instance network_info: |[{"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.674 187132 DEBUG oslo_concurrency.lockutils [req-1856f6b9-5c7e-42da-9608-02c931300e8f req-ad8de604-fad1-4b7a-901e-b85afd937dcc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.674 187132 DEBUG nova.network.neutron [req-1856f6b9-5c7e-42da-9608-02c931300e8f req-ad8de604-fad1-4b7a-901e-b85afd937dcc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Refreshing network info cache for port 476816b4-a4da-4bc1-b013-60c522330ded _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.680 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Start _get_guest_xml network_info=[{"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.687 187132 WARNING nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.694 187132 DEBUG nova.virt.libvirt.host [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.695 187132 DEBUG nova.virt.libvirt.host [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.702 187132 DEBUG nova.virt.libvirt.host [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.703 187132 DEBUG nova.virt.libvirt.host [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.704 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.705 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.705 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.705 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.705 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.706 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.706 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.706 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.706 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.706 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.707 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.707 187132 DEBUG nova.virt.hardware [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.710 187132 DEBUG nova.virt.libvirt.vif [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:17:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-114010906-access_point-951634044',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-114010906-access_point-951634044',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-114010906-acc',id=43,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDFOUpkBQsgNGSTfE8jhsAMU6fxi5Kjw6Z1gMY9XQ8r03QMNlMamVtgRMYVNipMdgs1FMO/ad17xEAQMBEVTnz53Hvp0T8PC2ZHyqlM+ScRPrKadCyZsofGod82ACxvw2w==',key_name='tempest-TestSecurityGroupsBasicOps-1619969181',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1528e0ec9d214424bcb218bb466f693e',ramdisk_id='',reservation_id='r-zxtpcn89',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-114010906',owner_user_name='tempest-TestSecurityGroupsBasicOps-114010906-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:17:42Z,user_data=None,user_id='9a0a63046d2f4ebd819e7c5bb47d172b',uuid=580cacf3-199a-4e28-a146-69ccaa92a8b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], 
"meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.710 187132 DEBUG nova.network.os_vif_util [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Converting VIF {"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.711 187132 DEBUG nova.network.os_vif_util [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:b7:3e,bridge_name='br-int',has_traffic_filtering=True,id=476816b4-a4da-4bc1-b013-60c522330ded,network=Network(51af992f-1722-4ed3-91ad-881c098813ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476816b4-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.712 187132 DEBUG nova.objects.instance [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lazy-loading 'pci_devices' on Instance uuid 580cacf3-199a-4e28-a146-69ccaa92a8b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.742 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  <uuid>580cacf3-199a-4e28-a146-69ccaa92a8b9</uuid>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  <name>instance-0000002b</name>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-114010906-access_point-951634044</nova:name>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:17:45</nova:creationTime>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:        <nova:user uuid="9a0a63046d2f4ebd819e7c5bb47d172b">tempest-TestSecurityGroupsBasicOps-114010906-project-member</nova:user>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:        <nova:project uuid="1528e0ec9d214424bcb218bb466f693e">tempest-TestSecurityGroupsBasicOps-114010906</nova:project>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:        <nova:port uuid="476816b4-a4da-4bc1-b013-60c522330ded">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.12" ipVersion="4"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <entry name="serial">580cacf3-199a-4e28-a146-69ccaa92a8b9</entry>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <entry name="uuid">580cacf3-199a-4e28-a146-69ccaa92a8b9</entry>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk.config"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:9b:b7:3e"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <target dev="tap476816b4-a4"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/console.log" append="off"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:17:45 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:17:45 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:17:45 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:17:45 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.744 187132 DEBUG nova.compute.manager [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Preparing to wait for external event network-vif-plugged-476816b4-a4da-4bc1-b013-60c522330ded prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.744 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Acquiring lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.745 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.745 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.747 187132 DEBUG nova.virt.libvirt.vif [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:17:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-114010906-access_point-951634044',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-114010906-access_point-951634044',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-114010906-acc',id=43,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDFOUpkBQsgNGSTfE8jhsAMU6fxi5Kjw6Z1gMY9XQ8r03QMNlMamVtgRMYVNipMdgs1FMO/ad17xEAQMBEVTnz53Hvp0T8PC2ZHyqlM+ScRPrKadCyZsofGod82ACxvw2w==',key_name='tempest-TestSecurityGroupsBasicOps-1619969181',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='1528e0ec9d214424bcb218bb466f693e',ramdisk_id='',reservation_id='r-zxtpcn89',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-114010906',owner_user_name='tempest-TestSecurityGroupsBasicOps-114010906-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:17:42Z,user_data=None,user_id='9a0a63046d2f4ebd819e7c5bb47d172b',uuid=580cacf3-199a-4e28-a146-69ccaa92a8b9,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.748 187132 DEBUG nova.network.os_vif_util [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Converting VIF {"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.749 187132 DEBUG nova.network.os_vif_util [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:9b:b7:3e,bridge_name='br-int',has_traffic_filtering=True,id=476816b4-a4da-4bc1-b013-60c522330ded,network=Network(51af992f-1722-4ed3-91ad-881c098813ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476816b4-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.750 187132 DEBUG os_vif [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:b7:3e,bridge_name='br-int',has_traffic_filtering=True,id=476816b4-a4da-4bc1-b013-60c522330ded,network=Network(51af992f-1722-4ed3-91ad-881c098813ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476816b4-a4') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.751 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.752 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.752 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.757 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.757 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap476816b4-a4, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.758 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap476816b4-a4, col_values=(('external_ids', {'iface-id': '476816b4-a4da-4bc1-b013-60c522330ded', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:9b:b7:3e', 'vm-uuid': '580cacf3-199a-4e28-a146-69ccaa92a8b9'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.760 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:45 np0005554845 NetworkManager[55529]: <info>  [1765433865.7625] manager: (tap476816b4-a4): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/142)
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.764 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.769 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.771 187132 INFO os_vif [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:9b:b7:3e,bridge_name='br-int',has_traffic_filtering=True,id=476816b4-a4da-4bc1-b013-60c522330ded,network=Network(51af992f-1722-4ed3-91ad-881c098813ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476816b4-a4')#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.778 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.826 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.826 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.826 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] No VIF found with MAC fa:16:3e:9b:b7:3e, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:17:45 np0005554845 nova_compute[187128]: 2025-12-11 06:17:45.827 187132 INFO nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Using config drive#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.289 187132 INFO nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Creating config drive at /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk.config#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.295 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_fq_m8ta execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.420 187132 DEBUG oslo_concurrency.processutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_fq_m8ta" returned: 0 in 0.125s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:17:46 np0005554845 kernel: tap476816b4-a4: entered promiscuous mode
Dec 11 01:17:46 np0005554845 NetworkManager[55529]: <info>  [1765433866.4750] manager: (tap476816b4-a4): new Tun device (/org/freedesktop/NetworkManager/Devices/143)
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.475 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:46 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:46Z|00276|binding|INFO|Claiming lport 476816b4-a4da-4bc1-b013-60c522330ded for this chassis.
Dec 11 01:17:46 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:46Z|00277|binding|INFO|476816b4-a4da-4bc1-b013-60c522330ded: Claiming fa:16:3e:9b:b7:3e 10.100.0.12
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.483 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:b7:3e 10.100.0.12'], port_security=['fa:16:3e:9b:b7:3e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '580cacf3-199a-4e28-a146-69ccaa92a8b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51af992f-1722-4ed3-91ad-881c098813ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1528e0ec9d214424bcb218bb466f693e', 'neutron:revision_number': '2', 'neutron:security_group_ids': '3c94c903-7793-49a0-95ae-65ecc283c877 ce427123-4e34-414b-9f66-5c7ec60aeb2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be5a8680-d4ab-469c-8663-1e85daaaa23a, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=476816b4-a4da-4bc1-b013-60c522330ded) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.484 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 476816b4-a4da-4bc1-b013-60c522330ded in datapath 51af992f-1722-4ed3-91ad-881c098813ea bound to our chassis#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.486 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 51af992f-1722-4ed3-91ad-881c098813ea#033[00m
Dec 11 01:17:46 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:46Z|00278|binding|INFO|Setting lport 476816b4-a4da-4bc1-b013-60c522330ded ovn-installed in OVS
Dec 11 01:17:46 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:46Z|00279|binding|INFO|Setting lport 476816b4-a4da-4bc1-b013-60c522330ded up in Southbound
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.491 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.496 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.503 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4ee71d95-a75e-460a-acdc-0dfb7d178951]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.504 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap51af992f-11 in ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.506 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap51af992f-10 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.507 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[01e43a0e-cf0d-4579-9094-1a26d40440be]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 systemd-udevd[223294]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.507 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b861330c-a62f-4f79-ad38-91a0b5ce8164]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 systemd-machined[153381]: New machine qemu-21-instance-0000002b.
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.519 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[c90564a8-1e2f-45a1-9052-0d932bea7707]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 NetworkManager[55529]: <info>  [1765433866.5248] device (tap476816b4-a4): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:17:46 np0005554845 NetworkManager[55529]: <info>  [1765433866.5263] device (tap476816b4-a4): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:17:46 np0005554845 systemd[1]: Started Virtual Machine qemu-21-instance-0000002b.
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.551 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b49e94fc-7f8d-4232-b501-aba549b29fb3]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.588 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[77646e12-a688-40e4-af6f-824ce6725065]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.593 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[33b30bae-e970-48db-a93f-59fa2f015857]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 NetworkManager[55529]: <info>  [1765433866.5950] manager: (tap51af992f-10): new Veth device (/org/freedesktop/NetworkManager/Devices/144)
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.633 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[465034ec-457c-408d-b5d2-cb2cd6486a86]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.638 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[8223b08d-069b-45e9-8b75-c27463f4f7b2]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 NetworkManager[55529]: <info>  [1765433866.6664] device (tap51af992f-10): carrier: link connected
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.673 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[31540b4c-2978-4655-94bd-de8c102c02d9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.695 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[04dc9337-9c7a-44ae-a2ce-5f1a0181c169]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap51af992f-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:42:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414131, 'reachable_time': 35784, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223327, 'error': None, 'target': 'ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.715 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1966b246-8427-405f-907d-c4622e59bd73]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fe59:4255'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 414131, 'tstamp': 414131}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 223328, 'error': None, 'target': 'ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.736 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[11093342-673f-4bb7-89ff-dcd6193118cc]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap51af992f-11'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:59:42:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 84], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414131, 'reachable_time': 35784, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 223329, 'error': None, 'target': 'ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.742 187132 DEBUG nova.compute.manager [req-a94eeae9-cb35-406f-9a7d-edc30511d5a5 req-23036b11-f0f7-4037-8ab5-3c3e39a5a621 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Received event network-vif-plugged-476816b4-a4da-4bc1-b013-60c522330ded external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.743 187132 DEBUG oslo_concurrency.lockutils [req-a94eeae9-cb35-406f-9a7d-edc30511d5a5 req-23036b11-f0f7-4037-8ab5-3c3e39a5a621 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.743 187132 DEBUG oslo_concurrency.lockutils [req-a94eeae9-cb35-406f-9a7d-edc30511d5a5 req-23036b11-f0f7-4037-8ab5-3c3e39a5a621 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.743 187132 DEBUG oslo_concurrency.lockutils [req-a94eeae9-cb35-406f-9a7d-edc30511d5a5 req-23036b11-f0f7-4037-8ab5-3c3e39a5a621 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.744 187132 DEBUG nova.compute.manager [req-a94eeae9-cb35-406f-9a7d-edc30511d5a5 req-23036b11-f0f7-4037-8ab5-3c3e39a5a621 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Processing event network-vif-plugged-476816b4-a4da-4bc1-b013-60c522330ded _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.774 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5bfcfc9f-44ea-4525-ad3a-b907b9be621f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.834 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[35dbb2f7-fbd2-4dd8-a416-26ed8e5d5333]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.836 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51af992f-10, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.836 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.837 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap51af992f-10, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.838 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:46 np0005554845 NetworkManager[55529]: <info>  [1765433866.8398] manager: (tap51af992f-10): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/145)
Dec 11 01:17:46 np0005554845 kernel: tap51af992f-10: entered promiscuous mode
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.843 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.844 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap51af992f-10, col_values=(('external_ids', {'iface-id': '70d781d2-de50-454e-b2ff-f95908ca3d6a'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.844 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.846 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.847 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/51af992f-1722-4ed3-91ad-881c098813ea.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/51af992f-1722-4ed3-91ad-881c098813ea.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:17:46 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:46Z|00280|binding|INFO|Releasing lport 70d781d2-de50-454e-b2ff-f95908ca3d6a from this chassis (sb_readonly=0)
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.847 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2959bdd6-aa29-4c08-b7d3-44d5db77595c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.848 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-51af992f-1722-4ed3-91ad-881c098813ea
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/51af992f-1722-4ed3-91ad-881c098813ea.pid.haproxy
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 51af992f-1722-4ed3-91ad-881c098813ea
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:17:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:46.848 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea', 'env', 'PROCESS_TAG=haproxy-51af992f-1722-4ed3-91ad-881c098813ea', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/51af992f-1722-4ed3-91ad-881c098813ea.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.867 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.948 187132 DEBUG nova.network.neutron [req-1856f6b9-5c7e-42da-9608-02c931300e8f req-ad8de604-fad1-4b7a-901e-b85afd937dcc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Updated VIF entry in instance network info cache for port 476816b4-a4da-4bc1-b013-60c522330ded. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.949 187132 DEBUG nova.network.neutron [req-1856f6b9-5c7e-42da-9608-02c931300e8f req-ad8de604-fad1-4b7a-901e-b85afd937dcc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Updating instance_info_cache with network_info: [{"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:17:46 np0005554845 nova_compute[187128]: 2025-12-11 06:17:46.964 187132 DEBUG oslo_concurrency.lockutils [req-1856f6b9-5c7e-42da-9608-02c931300e8f req-ad8de604-fad1-4b7a-901e-b85afd937dcc eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:17:47 np0005554845 podman[223363]: 2025-12-11 06:17:47.223364082 +0000 UTC m=+0.064292347 container create 48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.244 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433867.2435753, 580cacf3-199a-4e28-a146-69ccaa92a8b9 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.245 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] VM Started (Lifecycle Event)#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.248 187132 DEBUG nova.compute.manager [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.252 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.256 187132 INFO nova.virt.libvirt.driver [-] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Instance spawned successfully.#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.257 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:17:47 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:47.263 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '16'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.265 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.269 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:17:47 np0005554845 systemd[1]: Started libpod-conmon-48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8.scope.
Dec 11 01:17:47 np0005554845 podman[223363]: 2025-12-11 06:17:47.185436092 +0000 UTC m=+0.026364377 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:17:47 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:17:47 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/abf026504a21dc9c8c3155e767d6e0b8552a3d087a9c4a1633900303e1f54f47/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:17:47 np0005554845 podman[223363]: 2025-12-11 06:17:47.337291335 +0000 UTC m=+0.178219620 container init 48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Dec 11 01:17:47 np0005554845 podman[223363]: 2025-12-11 06:17:47.345128489 +0000 UTC m=+0.186056754 container start 48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 11 01:17:47 np0005554845 neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea[223384]: [NOTICE]   (223388) : New worker (223390) forked
Dec 11 01:17:47 np0005554845 neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea[223384]: [NOTICE]   (223388) : Loading success.
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.908 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.909 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433867.2448673, 580cacf3-199a-4e28-a146-69ccaa92a8b9 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.910 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.914 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.915 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.915 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.915 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.916 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.916 187132 DEBUG nova.virt.libvirt.driver [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.944 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.948 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433867.2516704, 580cacf3-199a-4e28-a146-69ccaa92a8b9 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.948 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.967 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.970 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:17:47 np0005554845 nova_compute[187128]: 2025-12-11 06:17:47.990 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:17:48 np0005554845 nova_compute[187128]: 2025-12-11 06:17:48.024 187132 INFO nova.compute.manager [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Took 5.53 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:17:48 np0005554845 nova_compute[187128]: 2025-12-11 06:17:48.025 187132 DEBUG nova.compute.manager [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:17:48 np0005554845 nova_compute[187128]: 2025-12-11 06:17:48.162 187132 INFO nova.compute.manager [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Took 6.07 seconds to build instance.#033[00m
Dec 11 01:17:48 np0005554845 nova_compute[187128]: 2025-12-11 06:17:48.188 187132 DEBUG oslo_concurrency.lockutils [None req-0ded275b-9a3e-43ee-81d4-6c4b3e32c0f7 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 6.170s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:48 np0005554845 nova_compute[187128]: 2025-12-11 06:17:48.829 187132 DEBUG nova.compute.manager [req-aa55154b-bb5d-4e4e-a3ad-fa2842921ef1 req-3137f422-b539-4857-833f-cf4494e7827d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Received event network-vif-plugged-476816b4-a4da-4bc1-b013-60c522330ded external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:17:48 np0005554845 nova_compute[187128]: 2025-12-11 06:17:48.830 187132 DEBUG oslo_concurrency.lockutils [req-aa55154b-bb5d-4e4e-a3ad-fa2842921ef1 req-3137f422-b539-4857-833f-cf4494e7827d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:48 np0005554845 nova_compute[187128]: 2025-12-11 06:17:48.830 187132 DEBUG oslo_concurrency.lockutils [req-aa55154b-bb5d-4e4e-a3ad-fa2842921ef1 req-3137f422-b539-4857-833f-cf4494e7827d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:48 np0005554845 nova_compute[187128]: 2025-12-11 06:17:48.831 187132 DEBUG oslo_concurrency.lockutils [req-aa55154b-bb5d-4e4e-a3ad-fa2842921ef1 req-3137f422-b539-4857-833f-cf4494e7827d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:48 np0005554845 nova_compute[187128]: 2025-12-11 06:17:48.831 187132 DEBUG nova.compute.manager [req-aa55154b-bb5d-4e4e-a3ad-fa2842921ef1 req-3137f422-b539-4857-833f-cf4494e7827d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] No waiting events found dispatching network-vif-plugged-476816b4-a4da-4bc1-b013-60c522330ded pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:17:48 np0005554845 nova_compute[187128]: 2025-12-11 06:17:48.831 187132 WARNING nova.compute.manager [req-aa55154b-bb5d-4e4e-a3ad-fa2842921ef1 req-3137f422-b539-4857-833f-cf4494e7827d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Received unexpected event network-vif-plugged-476816b4-a4da-4bc1-b013-60c522330ded for instance with vm_state active and task_state None.#033[00m
Dec 11 01:17:50 np0005554845 nova_compute[187128]: 2025-12-11 06:17:50.761 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:50 np0005554845 nova_compute[187128]: 2025-12-11 06:17:50.781 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:53 np0005554845 nova_compute[187128]: 2025-12-11 06:17:53.798 187132 DEBUG nova.compute.manager [req-54690047-3f14-4055-b99f-fa1a7aa5f364 req-2fc69f0d-64a3-48f7-b6b9-eb8909373a60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Received event network-changed-476816b4-a4da-4bc1-b013-60c522330ded external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:17:53 np0005554845 nova_compute[187128]: 2025-12-11 06:17:53.799 187132 DEBUG nova.compute.manager [req-54690047-3f14-4055-b99f-fa1a7aa5f364 req-2fc69f0d-64a3-48f7-b6b9-eb8909373a60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Refreshing instance network info cache due to event network-changed-476816b4-a4da-4bc1-b013-60c522330ded. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:17:53 np0005554845 nova_compute[187128]: 2025-12-11 06:17:53.799 187132 DEBUG oslo_concurrency.lockutils [req-54690047-3f14-4055-b99f-fa1a7aa5f364 req-2fc69f0d-64a3-48f7-b6b9-eb8909373a60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:17:53 np0005554845 nova_compute[187128]: 2025-12-11 06:17:53.800 187132 DEBUG oslo_concurrency.lockutils [req-54690047-3f14-4055-b99f-fa1a7aa5f364 req-2fc69f0d-64a3-48f7-b6b9-eb8909373a60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:17:53 np0005554845 nova_compute[187128]: 2025-12-11 06:17:53.800 187132 DEBUG nova.network.neutron [req-54690047-3f14-4055-b99f-fa1a7aa5f364 req-2fc69f0d-64a3-48f7-b6b9-eb8909373a60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Refreshing network info cache for port 476816b4-a4da-4bc1-b013-60c522330ded _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:17:55 np0005554845 podman[223399]: 2025-12-11 06:17:55.12936053 +0000 UTC m=+0.056619008 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:17:55 np0005554845 nova_compute[187128]: 2025-12-11 06:17:55.783 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 4999-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:17:55 np0005554845 nova_compute[187128]: 2025-12-11 06:17:55.786 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:17:55 np0005554845 nova_compute[187128]: 2025-12-11 06:17:55.787 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: idle 5005 ms, sending inactivity probe run /usr/lib64/python3.9/site-packages/ovs/reconnect.py:117#033[00m
Dec 11 01:17:55 np0005554845 nova_compute[187128]: 2025-12-11 06:17:55.787 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering IDLE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 11 01:17:55 np0005554845 nova_compute[187128]: 2025-12-11 06:17:55.821 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:55 np0005554845 nova_compute[187128]: 2025-12-11 06:17:55.822 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] tcp:127.0.0.1:6640: entering ACTIVE _transition /usr/lib64/python3.9/site-packages/ovs/reconnect.py:519#033[00m
Dec 11 01:17:56 np0005554845 nova_compute[187128]: 2025-12-11 06:17:56.806 187132 DEBUG nova.network.neutron [req-54690047-3f14-4055-b99f-fa1a7aa5f364 req-2fc69f0d-64a3-48f7-b6b9-eb8909373a60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Updated VIF entry in instance network info cache for port 476816b4-a4da-4bc1-b013-60c522330ded. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:17:56 np0005554845 nova_compute[187128]: 2025-12-11 06:17:56.808 187132 DEBUG nova.network.neutron [req-54690047-3f14-4055-b99f-fa1a7aa5f364 req-2fc69f0d-64a3-48f7-b6b9-eb8909373a60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Updating instance_info_cache with network_info: [{"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:17:56 np0005554845 nova_compute[187128]: 2025-12-11 06:17:56.831 187132 DEBUG oslo_concurrency.lockutils [req-54690047-3f14-4055-b99f-fa1a7aa5f364 req-2fc69f0d-64a3-48f7-b6b9-eb8909373a60 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.537 187132 DEBUG nova.compute.manager [req-0e724cad-cbb5-4aa8-baff-1ebe6d9e5729 req-1c1ebaeb-4cb9-4837-9bef-47b342aa6fcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Received event network-changed-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.538 187132 DEBUG nova.compute.manager [req-0e724cad-cbb5-4aa8-baff-1ebe6d9e5729 req-1c1ebaeb-4cb9-4837-9bef-47b342aa6fcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Refreshing instance network info cache due to event network-changed-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.538 187132 DEBUG oslo_concurrency.lockutils [req-0e724cad-cbb5-4aa8-baff-1ebe6d9e5729 req-1c1ebaeb-4cb9-4837-9bef-47b342aa6fcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.539 187132 DEBUG oslo_concurrency.lockutils [req-0e724cad-cbb5-4aa8-baff-1ebe6d9e5729 req-1c1ebaeb-4cb9-4837-9bef-47b342aa6fcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.539 187132 DEBUG nova.network.neutron [req-0e724cad-cbb5-4aa8-baff-1ebe6d9e5729 req-1c1ebaeb-4cb9-4837-9bef-47b342aa6fcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Refreshing network info cache for port 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.591 187132 DEBUG oslo_concurrency.lockutils [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.592 187132 DEBUG oslo_concurrency.lockutils [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.592 187132 DEBUG oslo_concurrency.lockutils [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.592 187132 DEBUG oslo_concurrency.lockutils [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.593 187132 DEBUG oslo_concurrency.lockutils [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.594 187132 INFO nova.compute.manager [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Terminating instance#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.595 187132 DEBUG nova.compute.manager [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:17:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:59Z|00038|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:9b:b7:3e 10.100.0.12
Dec 11 01:17:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:59Z|00039|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:9b:b7:3e 10.100.0.12
Dec 11 01:17:59 np0005554845 kernel: tap0ce80b63-29 (unregistering): left promiscuous mode
Dec 11 01:17:59 np0005554845 NetworkManager[55529]: <info>  [1765433879.6213] device (tap0ce80b63-29): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.628 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:59Z|00281|binding|INFO|Releasing lport 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 from this chassis (sb_readonly=0)
Dec 11 01:17:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:59Z|00282|binding|INFO|Setting lport 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 down in Southbound
Dec 11 01:17:59 np0005554845 ovn_controller[95428]: 2025-12-11T06:17:59Z|00283|binding|INFO|Removing iface tap0ce80b63-29 ovn-installed in OVS
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.631 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:59.643 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b9:7e:24 10.100.0.10 2001:db8:0:1:f816:3eff:feb9:7e24 2001:db8::f816:3eff:feb9:7e24'], port_security=['fa:16:3e:b9:7e:24 10.100.0.10 2001:db8:0:1:f816:3eff:feb9:7e24 2001:db8::f816:3eff:feb9:7e24'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.10/28 2001:db8:0:1:f816:3eff:feb9:7e24/64 2001:db8::f816:3eff:feb9:7e24/64', 'neutron:device_id': '45a86888-e6a3-42e0-a383-d78cdd0e25fd', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2a302d93-d825-40ba-a363-74d1dc48857e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=078ed33d-3a39-4095-bb26-184c0c14abff, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=0ce80b63-298a-4ec4-9d0a-8a7632ca6b57) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:17:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:59.644 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 in datapath e8f46fd3-4213-49d6-9445-d5868c7b20f6 unbound from our chassis#033[00m
Dec 11 01:17:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:59.646 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network e8f46fd3-4213-49d6-9445-d5868c7b20f6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.647 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:59.647 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fd385468-b94d-4795-85f3-672abe74f28d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:59.649 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6 namespace which is not needed anymore#033[00m
Dec 11 01:17:59 np0005554845 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002a.scope: Deactivated successfully.
Dec 11 01:17:59 np0005554845 systemd[1]: machine-qemu\x2d20\x2dinstance\x2d0000002a.scope: Consumed 14.714s CPU time.
Dec 11 01:17:59 np0005554845 systemd-machined[153381]: Machine qemu-20-instance-0000002a terminated.
Dec 11 01:17:59 np0005554845 neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6[223066]: [NOTICE]   (223070) : haproxy version is 2.8.14-c23fe91
Dec 11 01:17:59 np0005554845 neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6[223066]: [NOTICE]   (223070) : path to executable is /usr/sbin/haproxy
Dec 11 01:17:59 np0005554845 neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6[223066]: [WARNING]  (223070) : Exiting Master process...
Dec 11 01:17:59 np0005554845 neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6[223066]: [ALERT]    (223070) : Current worker (223072) exited with code 143 (Terminated)
Dec 11 01:17:59 np0005554845 neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6[223066]: [WARNING]  (223070) : All workers exited. Exiting... (0)
Dec 11 01:17:59 np0005554845 systemd[1]: libpod-a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7.scope: Deactivated successfully.
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.821 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:59 np0005554845 podman[223462]: 2025-12-11 06:17:59.825003475 +0000 UTC m=+0.070224977 container died a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.826 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.856 187132 DEBUG nova.compute.manager [req-7dc9f232-dd86-439c-9553-6b0997749614 req-4dd7264c-e440-4515-bb8b-9462e6e43a2a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Received event network-vif-unplugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.859 187132 DEBUG oslo_concurrency.lockutils [req-7dc9f232-dd86-439c-9553-6b0997749614 req-4dd7264c-e440-4515-bb8b-9462e6e43a2a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.859 187132 DEBUG oslo_concurrency.lockutils [req-7dc9f232-dd86-439c-9553-6b0997749614 req-4dd7264c-e440-4515-bb8b-9462e6e43a2a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.860 187132 DEBUG oslo_concurrency.lockutils [req-7dc9f232-dd86-439c-9553-6b0997749614 req-4dd7264c-e440-4515-bb8b-9462e6e43a2a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.860 187132 DEBUG nova.compute.manager [req-7dc9f232-dd86-439c-9553-6b0997749614 req-4dd7264c-e440-4515-bb8b-9462e6e43a2a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] No waiting events found dispatching network-vif-unplugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.860 187132 DEBUG nova.compute.manager [req-7dc9f232-dd86-439c-9553-6b0997749614 req-4dd7264c-e440-4515-bb8b-9462e6e43a2a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Received event network-vif-unplugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:17:59 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7-userdata-shm.mount: Deactivated successfully.
Dec 11 01:17:59 np0005554845 systemd[1]: var-lib-containers-storage-overlay-940fbd2bf39ebc1b31a03b08859b25be3519dad5fcd8aa60fec108af0d4ceb43-merged.mount: Deactivated successfully.
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.872 187132 INFO nova.virt.libvirt.driver [-] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Instance destroyed successfully.#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.872 187132 DEBUG nova.objects.instance [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'resources' on Instance uuid 45a86888-e6a3-42e0-a383-d78cdd0e25fd obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:17:59 np0005554845 podman[223462]: 2025-12-11 06:17:59.879213208 +0000 UTC m=+0.124434690 container cleanup a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS)
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.887 187132 DEBUG nova.virt.libvirt.vif [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:17:11Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-538885827',display_name='tempest-TestGettingAddress-server-538885827',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-538885827',id=42,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBGjbLyg1YoDhHRFVFNs2iD0sgE6iKgmDCYJJhk9ForuYdpOT3u1ErL+6vfB0W0+wJQ87rVKekA4NcoJilcp+eACrmScyyWa4ZYGGh/WHn+bQNFNeaoBE9WbIGywRdxMdug==',key_name='tempest-TestGettingAddress-178936991',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:17:20Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-d7equdrc',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:17:20Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=45a86888-e6a3-42e0-a383-d78cdd0e25fd,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": 
"fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.888 187132 DEBUG nova.network.os_vif_util [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.213", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.889 187132 DEBUG nova.network.os_vif_util [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:b9:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=0ce80b63-298a-4ec4-9d0a-8a7632ca6b57,network=Network(e8f46fd3-4213-49d6-9445-d5868c7b20f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ce80b63-29') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.889 187132 DEBUG os_vif [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=0ce80b63-298a-4ec4-9d0a-8a7632ca6b57,network=Network(e8f46fd3-4213-49d6-9445-d5868c7b20f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ce80b63-29') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.891 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.892 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0ce80b63-29, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.893 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.894 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:17:59 np0005554845 systemd[1]: libpod-conmon-a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7.scope: Deactivated successfully.
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.896 187132 INFO os_vif [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:b9:7e:24,bridge_name='br-int',has_traffic_filtering=True,id=0ce80b63-298a-4ec4-9d0a-8a7632ca6b57,network=Network(e8f46fd3-4213-49d6-9445-d5868c7b20f6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap0ce80b63-29')#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.897 187132 INFO nova.virt.libvirt.driver [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Deleting instance files /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd_del#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.898 187132 INFO nova.virt.libvirt.driver [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Deletion of /var/lib/nova/instances/45a86888-e6a3-42e0-a383-d78cdd0e25fd_del complete#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.956 187132 INFO nova.compute.manager [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.956 187132 DEBUG oslo.service.loopingcall [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.957 187132 DEBUG nova.compute.manager [-] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.957 187132 DEBUG nova.network.neutron [-] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:17:59 np0005554845 podman[223509]: 2025-12-11 06:17:59.962814527 +0000 UTC m=+0.057142892 container remove a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:17:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:59.971 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[e1a485c3-f53e-40d6-91db-96c2adc85c6c]: (4, ('Thu Dec 11 06:17:59 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6 (a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7)\na39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7\nThu Dec 11 06:17:59 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6 (a39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7)\na39ea7ac682d2494840fa5a9ebe757aedcb29baba70ebcc59c7928658a7428b7\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:59.973 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[df8c6e21-40b8-4342-bffd-12f79d3baea9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:17:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:59.975 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape8f46fd3-40, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.978 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:59 np0005554845 kernel: tape8f46fd3-40: left promiscuous mode
Dec 11 01:17:59 np0005554845 nova_compute[187128]: 2025-12-11 06:17:59.989 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:17:59 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:17:59.994 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[dd402aca-d865-464d-adf3-7c31a1c0ec32]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:00.013 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ef44e7a9-27d2-451c-b564-a089fed3247c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:00.014 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1222f35f-3e6d-445d-b0b9-8509e6cfd8c6]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:00.034 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[58d96bc1-9290-499b-9f56-6722a7393cad]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 411456, 'reachable_time': 22340, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223524, 'error': None, 'target': 'ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:00 np0005554845 systemd[1]: run-netns-ovnmeta\x2de8f46fd3\x2d4213\x2d49d6\x2d9445\x2dd5868c7b20f6.mount: Deactivated successfully.
Dec 11 01:18:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:00.041 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-e8f46fd3-4213-49d6-9445-d5868c7b20f6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:18:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:00.041 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[3905e634-af38-4fd9-a5ed-ce0b525dbb73]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:00 np0005554845 nova_compute[187128]: 2025-12-11 06:18:00.823 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:01 np0005554845 podman[223525]: 2025-12-11 06:18:01.142754347 +0000 UTC m=+0.070068923 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 01:18:01 np0005554845 nova_compute[187128]: 2025-12-11 06:18:01.982 187132 DEBUG nova.compute.manager [req-732ec7da-3e4d-48cc-9648-b78c1335c761 req-97a60c4a-5d51-4a14-8b33-2d1a6c5d12b4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Received event network-vif-plugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:18:01 np0005554845 nova_compute[187128]: 2025-12-11 06:18:01.983 187132 DEBUG oslo_concurrency.lockutils [req-732ec7da-3e4d-48cc-9648-b78c1335c761 req-97a60c4a-5d51-4a14-8b33-2d1a6c5d12b4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:18:01 np0005554845 nova_compute[187128]: 2025-12-11 06:18:01.984 187132 DEBUG oslo_concurrency.lockutils [req-732ec7da-3e4d-48cc-9648-b78c1335c761 req-97a60c4a-5d51-4a14-8b33-2d1a6c5d12b4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:18:01 np0005554845 nova_compute[187128]: 2025-12-11 06:18:01.984 187132 DEBUG oslo_concurrency.lockutils [req-732ec7da-3e4d-48cc-9648-b78c1335c761 req-97a60c4a-5d51-4a14-8b33-2d1a6c5d12b4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:18:01 np0005554845 nova_compute[187128]: 2025-12-11 06:18:01.985 187132 DEBUG nova.compute.manager [req-732ec7da-3e4d-48cc-9648-b78c1335c761 req-97a60c4a-5d51-4a14-8b33-2d1a6c5d12b4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] No waiting events found dispatching network-vif-plugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:18:01 np0005554845 nova_compute[187128]: 2025-12-11 06:18:01.985 187132 WARNING nova.compute.manager [req-732ec7da-3e4d-48cc-9648-b78c1335c761 req-97a60c4a-5d51-4a14-8b33-2d1a6c5d12b4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Received unexpected event network-vif-plugged-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 for instance with vm_state active and task_state deleting.#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.039 187132 DEBUG nova.network.neutron [-] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.054 187132 INFO nova.compute.manager [-] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Took 2.10 seconds to deallocate network for instance.#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.102 187132 DEBUG oslo_concurrency.lockutils [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.103 187132 DEBUG oslo_concurrency.lockutils [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.188 187132 DEBUG nova.compute.provider_tree [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.208 187132 DEBUG nova.scheduler.client.report [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.230 187132 DEBUG oslo_concurrency.lockutils [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.126s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.254 187132 INFO nova.scheduler.client.report [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Deleted allocations for instance 45a86888-e6a3-42e0-a383-d78cdd0e25fd#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.317 187132 DEBUG oslo_concurrency.lockutils [None req-1333339f-2fab-486e-9f7d-74fd08d5edc6 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "45a86888-e6a3-42e0-a383-d78cdd0e25fd" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.725s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.402 187132 DEBUG nova.network.neutron [req-0e724cad-cbb5-4aa8-baff-1ebe6d9e5729 req-1c1ebaeb-4cb9-4837-9bef-47b342aa6fcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Updated VIF entry in instance network info cache for port 0ce80b63-298a-4ec4-9d0a-8a7632ca6b57. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.402 187132 DEBUG nova.network.neutron [req-0e724cad-cbb5-4aa8-baff-1ebe6d9e5729 req-1c1ebaeb-4cb9-4837-9bef-47b342aa6fcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Updating instance_info_cache with network_info: [{"id": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "address": "fa:16:3e:b9:7e:24", "network": {"id": "e8f46fd3-4213-49d6-9445-d5868c7b20f6", "bridge": "br-int", "label": "tempest-network-smoke--2075888528", "subnets": [{"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.10", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:feb9:7e24", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap0ce80b63-29", "ovs_interfaceid": "0ce80b63-298a-4ec4-9d0a-8a7632ca6b57", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:18:02 np0005554845 nova_compute[187128]: 2025-12-11 06:18:02.420 187132 DEBUG oslo_concurrency.lockutils [req-0e724cad-cbb5-4aa8-baff-1ebe6d9e5729 req-1c1ebaeb-4cb9-4837-9bef-47b342aa6fcb eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-45a86888-e6a3-42e0-a383-d78cdd0e25fd" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:18:03 np0005554845 podman[223544]: 2025-12-11 06:18:03.126314359 +0000 UTC m=+0.059406995 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent)
Dec 11 01:18:03 np0005554845 podman[223545]: 2025-12-11 06:18:03.172096472 +0000 UTC m=+0.096848041 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 01:18:04 np0005554845 nova_compute[187128]: 2025-12-11 06:18:04.060 187132 DEBUG nova.compute.manager [req-c85a6676-0e8e-4d22-9f50-f8b80faa2924 req-7c76be0d-e582-4129-a973-9f8fba3e3067 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Received event network-vif-deleted-0ce80b63-298a-4ec4-9d0a-8a7632ca6b57 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:18:04 np0005554845 nova_compute[187128]: 2025-12-11 06:18:04.893 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:05 np0005554845 nova_compute[187128]: 2025-12-11 06:18:05.825 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:06 np0005554845 podman[223590]: 2025-12-11 06:18:06.128482319 +0000 UTC m=+0.056294050 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:18:09 np0005554845 nova_compute[187128]: 2025-12-11 06:18:09.896 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:10 np0005554845 nova_compute[187128]: 2025-12-11 06:18:10.828 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:11 np0005554845 podman[223611]: 2025-12-11 06:18:11.107685564 +0000 UTC m=+0.043658526 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:18:11 np0005554845 podman[223612]: 2025-12-11 06:18:11.132405486 +0000 UTC m=+0.062912409 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, config_id=edpm, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, maintainer=Red Hat, Inc., architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, build-date=2025-08-20T13:12:41, release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, vcs-type=git, container_name=openstack_network_exporter)
Dec 11 01:18:14 np0005554845 nova_compute[187128]: 2025-12-11 06:18:14.871 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433879.869995, 45a86888-e6a3-42e0-a383-d78cdd0e25fd => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:18:14 np0005554845 nova_compute[187128]: 2025-12-11 06:18:14.872 187132 INFO nova.compute.manager [-] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:18:14 np0005554845 nova_compute[187128]: 2025-12-11 06:18:14.897 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:14 np0005554845 nova_compute[187128]: 2025-12-11 06:18:14.910 187132 DEBUG nova.compute.manager [None req-73d1ee23-ab04-4fac-8fc0-fc4795f294e4 - - - - - -] [instance: 45a86888-e6a3-42e0-a383-d78cdd0e25fd] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:18:15 np0005554845 nova_compute[187128]: 2025-12-11 06:18:15.830 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:16 np0005554845 ovn_controller[95428]: 2025-12-11T06:18:16Z|00284|binding|INFO|Releasing lport 70d781d2-de50-454e-b2ff-f95908ca3d6a from this chassis (sb_readonly=0)
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.297 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:16 np0005554845 ovn_controller[95428]: 2025-12-11T06:18:16Z|00285|binding|INFO|Releasing lport 70d781d2-de50-454e-b2ff-f95908ca3d6a from this chassis (sb_readonly=0)
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.442 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.487 187132 DEBUG nova.compute.manager [req-3a27085c-de36-4459-a500-a27a5914e340 req-2ecc7a0a-6c0a-42f4-a4d1-d24cff5ccfbd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Received event network-changed-476816b4-a4da-4bc1-b013-60c522330ded external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.488 187132 DEBUG nova.compute.manager [req-3a27085c-de36-4459-a500-a27a5914e340 req-2ecc7a0a-6c0a-42f4-a4d1-d24cff5ccfbd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Refreshing instance network info cache due to event network-changed-476816b4-a4da-4bc1-b013-60c522330ded. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.488 187132 DEBUG oslo_concurrency.lockutils [req-3a27085c-de36-4459-a500-a27a5914e340 req-2ecc7a0a-6c0a-42f4-a4d1-d24cff5ccfbd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.488 187132 DEBUG oslo_concurrency.lockutils [req-3a27085c-de36-4459-a500-a27a5914e340 req-2ecc7a0a-6c0a-42f4-a4d1-d24cff5ccfbd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.488 187132 DEBUG nova.network.neutron [req-3a27085c-de36-4459-a500-a27a5914e340 req-2ecc7a0a-6c0a-42f4-a4d1-d24cff5ccfbd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Refreshing network info cache for port 476816b4-a4da-4bc1-b013-60c522330ded _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.565 187132 DEBUG oslo_concurrency.lockutils [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Acquiring lock "580cacf3-199a-4e28-a146-69ccaa92a8b9" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.566 187132 DEBUG oslo_concurrency.lockutils [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.567 187132 DEBUG oslo_concurrency.lockutils [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Acquiring lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.567 187132 DEBUG oslo_concurrency.lockutils [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.568 187132 DEBUG oslo_concurrency.lockutils [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.570 187132 INFO nova.compute.manager [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Terminating instance#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.571 187132 DEBUG nova.compute.manager [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:18:16 np0005554845 kernel: tap476816b4-a4 (unregistering): left promiscuous mode
Dec 11 01:18:16 np0005554845 NetworkManager[55529]: <info>  [1765433896.5970] device (tap476816b4-a4): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:18:16 np0005554845 ovn_controller[95428]: 2025-12-11T06:18:16Z|00286|binding|INFO|Releasing lport 476816b4-a4da-4bc1-b013-60c522330ded from this chassis (sb_readonly=0)
Dec 11 01:18:16 np0005554845 ovn_controller[95428]: 2025-12-11T06:18:16Z|00287|binding|INFO|Setting lport 476816b4-a4da-4bc1-b013-60c522330ded down in Southbound
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.612 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:16 np0005554845 ovn_controller[95428]: 2025-12-11T06:18:16Z|00288|binding|INFO|Removing iface tap476816b4-a4 ovn-installed in OVS
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.621 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:9b:b7:3e 10.100.0.12'], port_security=['fa:16:3e:9b:b7:3e 10.100.0.12'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.12/28', 'neutron:device_id': '580cacf3-199a-4e28-a146-69ccaa92a8b9', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-51af992f-1722-4ed3-91ad-881c098813ea', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '1528e0ec9d214424bcb218bb466f693e', 'neutron:revision_number': '4', 'neutron:security_group_ids': '3c94c903-7793-49a0-95ae-65ecc283c877 ce427123-4e34-414b-9f66-5c7ec60aeb2a', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=be5a8680-d4ab-469c-8663-1e85daaaa23a, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=476816b4-a4da-4bc1-b013-60c522330ded) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.623 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 476816b4-a4da-4bc1-b013-60c522330ded in datapath 51af992f-1722-4ed3-91ad-881c098813ea unbound from our chassis#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.624 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 51af992f-1722-4ed3-91ad-881c098813ea, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.625 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[d3144481-fc08-4909-b0a8-0ab7688fea0e]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.626 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea namespace which is not needed anymore#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.642 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:16 np0005554845 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000002b.scope: Deactivated successfully.
Dec 11 01:18:16 np0005554845 systemd[1]: machine-qemu\x2d21\x2dinstance\x2d0000002b.scope: Consumed 13.663s CPU time.
Dec 11 01:18:16 np0005554845 systemd-machined[153381]: Machine qemu-21-instance-0000002b terminated.
Dec 11 01:18:16 np0005554845 neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea[223384]: [NOTICE]   (223388) : haproxy version is 2.8.14-c23fe91
Dec 11 01:18:16 np0005554845 neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea[223384]: [NOTICE]   (223388) : path to executable is /usr/sbin/haproxy
Dec 11 01:18:16 np0005554845 neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea[223384]: [WARNING]  (223388) : Exiting Master process...
Dec 11 01:18:16 np0005554845 neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea[223384]: [ALERT]    (223388) : Current worker (223390) exited with code 143 (Terminated)
Dec 11 01:18:16 np0005554845 neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea[223384]: [WARNING]  (223388) : All workers exited. Exiting... (0)
Dec 11 01:18:16 np0005554845 systemd[1]: libpod-48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8.scope: Deactivated successfully.
Dec 11 01:18:16 np0005554845 podman[223678]: 2025-12-11 06:18:16.780426881 +0000 UTC m=+0.054734988 container died 48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:18:16 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8-userdata-shm.mount: Deactivated successfully.
Dec 11 01:18:16 np0005554845 systemd[1]: var-lib-containers-storage-overlay-abf026504a21dc9c8c3155e767d6e0b8552a3d087a9c4a1633900303e1f54f47-merged.mount: Deactivated successfully.
Dec 11 01:18:16 np0005554845 podman[223678]: 2025-12-11 06:18:16.824677682 +0000 UTC m=+0.098985739 container cleanup 48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.832 187132 INFO nova.virt.libvirt.driver [-] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Instance destroyed successfully.#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.833 187132 DEBUG nova.objects.instance [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lazy-loading 'resources' on Instance uuid 580cacf3-199a-4e28-a146-69ccaa92a8b9 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:18:16 np0005554845 systemd[1]: libpod-conmon-48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8.scope: Deactivated successfully.
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.852 187132 DEBUG nova.virt.libvirt.vif [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:17:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-114010906-access_point-951634044',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-114010906-access_point-951634044',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-114010906-acc',id=43,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBDFOUpkBQsgNGSTfE8jhsAMU6fxi5Kjw6Z1gMY9XQ8r03QMNlMamVtgRMYVNipMdgs1FMO/ad17xEAQMBEVTnz53Hvp0T8PC2ZHyqlM+ScRPrKadCyZsofGod82ACxvw2w==',key_name='tempest-TestSecurityGroupsBasicOps-1619969181',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:17:48Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='1528e0ec9d214424bcb218bb466f693e',ramdisk_id='',reservation_id='r-zxtpcn89',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-114010906',owner_user_name='tempest-TestSecurityGroupsBasicOps-114010906-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:17:48Z,user_data=None,user_id='9a0a63046d2f4ebd819e7c5bb47d172b',uuid=580cacf3-199a-4e28-a146-69ccaa92a8b9,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.852 187132 DEBUG nova.network.os_vif_util [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Converting VIF {"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.239", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.853 187132 DEBUG nova.network.os_vif_util [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:9b:b7:3e,bridge_name='br-int',has_traffic_filtering=True,id=476816b4-a4da-4bc1-b013-60c522330ded,network=Network(51af992f-1722-4ed3-91ad-881c098813ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476816b4-a4') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.853 187132 DEBUG os_vif [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:b7:3e,bridge_name='br-int',has_traffic_filtering=True,id=476816b4-a4da-4bc1-b013-60c522330ded,network=Network(51af992f-1722-4ed3-91ad-881c098813ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476816b4-a4') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.854 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.855 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap476816b4-a4, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.856 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.858 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.860 187132 INFO os_vif [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:9b:b7:3e,bridge_name='br-int',has_traffic_filtering=True,id=476816b4-a4da-4bc1-b013-60c522330ded,network=Network(51af992f-1722-4ed3-91ad-881c098813ea),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap476816b4-a4')#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.860 187132 INFO nova.virt.libvirt.driver [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Deleting instance files /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9_del#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.861 187132 INFO nova.virt.libvirt.driver [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Deletion of /var/lib/nova/instances/580cacf3-199a-4e28-a146-69ccaa92a8b9_del complete#033[00m
Dec 11 01:18:16 np0005554845 podman[223725]: 2025-12-11 06:18:16.891599759 +0000 UTC m=+0.044608543 container remove 48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.897 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1bd6b33d-1785-4029-b08b-287e7edf19b2]: (4, ('Thu Dec 11 06:18:16 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea (48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8)\n48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8\nThu Dec 11 06:18:16 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea (48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8)\n48e2e7dd27aa986154802595cc8ba8a9c585358902a8492405c3daa6db071ca8\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.899 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ba828a40-bba6-428c-a565-1fdca992a1bf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.900 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap51af992f-10, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:18:16 np0005554845 kernel: tap51af992f-10: left promiscuous mode
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.902 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.913 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.915 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[00f710c3-2e1d-4466-8c3a-22f14f86e076]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.927 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7f665d06-b777-430d-a9a4-b4dbcad53859]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.928 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b315de7e-06ec-4e8e-a3e1-5051c0e23b9d]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.932 187132 INFO nova.compute.manager [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Took 0.36 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.933 187132 DEBUG oslo.service.loopingcall [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.933 187132 DEBUG nova.compute.manager [-] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:18:16 np0005554845 nova_compute[187128]: 2025-12-11 06:18:16.933 187132 DEBUG nova.network.neutron [-] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.944 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[471aee18-fe27-4f81-9240-820576e4dc8b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 414122, 'reachable_time': 21330, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 223738, 'error': None, 'target': 'ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.946 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-51af992f-1722-4ed3-91ad-881c098813ea deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:18:16 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:16.946 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[801bf4b8-fba7-4902-ab57-050959b5a054]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:16 np0005554845 systemd[1]: run-netns-ovnmeta\x2d51af992f\x2d1722\x2d4ed3\x2d91ad\x2d881c098813ea.mount: Deactivated successfully.
Dec 11 01:18:17 np0005554845 nova_compute[187128]: 2025-12-11 06:18:17.868 187132 DEBUG nova.network.neutron [-] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:18:17 np0005554845 nova_compute[187128]: 2025-12-11 06:18:17.895 187132 INFO nova.compute.manager [-] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Took 0.96 seconds to deallocate network for instance.#033[00m
Dec 11 01:18:17 np0005554845 nova_compute[187128]: 2025-12-11 06:18:17.950 187132 DEBUG oslo_concurrency.lockutils [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:18:17 np0005554845 nova_compute[187128]: 2025-12-11 06:18:17.951 187132 DEBUG oslo_concurrency.lockutils [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:18:17 np0005554845 nova_compute[187128]: 2025-12-11 06:18:17.976 187132 DEBUG nova.compute.manager [req-be8a1888-3987-42fb-aee8-5565284ff6e0 req-a85a308c-15b6-45c4-90bc-bff61d57386a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Received event network-vif-deleted-476816b4-a4da-4bc1-b013-60c522330ded external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:18:17 np0005554845 nova_compute[187128]: 2025-12-11 06:18:17.995 187132 DEBUG nova.network.neutron [req-3a27085c-de36-4459-a500-a27a5914e340 req-2ecc7a0a-6c0a-42f4-a4d1-d24cff5ccfbd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Updated VIF entry in instance network info cache for port 476816b4-a4da-4bc1-b013-60c522330ded. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:18:17 np0005554845 nova_compute[187128]: 2025-12-11 06:18:17.996 187132 DEBUG nova.network.neutron [req-3a27085c-de36-4459-a500-a27a5914e340 req-2ecc7a0a-6c0a-42f4-a4d1-d24cff5ccfbd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Updating instance_info_cache with network_info: [{"id": "476816b4-a4da-4bc1-b013-60c522330ded", "address": "fa:16:3e:9b:b7:3e", "network": {"id": "51af992f-1722-4ed3-91ad-881c098813ea", "bridge": "br-int", "label": "tempest-network-smoke--142584539", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.12", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "1528e0ec9d214424bcb218bb466f693e", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap476816b4-a4", "ovs_interfaceid": "476816b4-a4da-4bc1-b013-60c522330ded", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.012 187132 DEBUG nova.compute.provider_tree [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.015 187132 DEBUG oslo_concurrency.lockutils [req-3a27085c-de36-4459-a500-a27a5914e340 req-2ecc7a0a-6c0a-42f4-a4d1-d24cff5ccfbd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-580cacf3-199a-4e28-a146-69ccaa92a8b9" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.027 187132 DEBUG nova.scheduler.client.report [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.052 187132 DEBUG oslo_concurrency.lockutils [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.102s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.077 187132 INFO nova.scheduler.client.report [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Deleted allocations for instance 580cacf3-199a-4e28-a146-69ccaa92a8b9#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.139 187132 DEBUG oslo_concurrency.lockutils [None req-af46f7e9-1d2d-4bd4-bafd-fa215f855534 9a0a63046d2f4ebd819e7c5bb47d172b 1528e0ec9d214424bcb218bb466f693e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.573s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.612 187132 DEBUG nova.compute.manager [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Received event network-vif-unplugged-476816b4-a4da-4bc1-b013-60c522330ded external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.613 187132 DEBUG oslo_concurrency.lockutils [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.613 187132 DEBUG oslo_concurrency.lockutils [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.613 187132 DEBUG oslo_concurrency.lockutils [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.614 187132 DEBUG nova.compute.manager [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] No waiting events found dispatching network-vif-unplugged-476816b4-a4da-4bc1-b013-60c522330ded pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.614 187132 WARNING nova.compute.manager [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Received unexpected event network-vif-unplugged-476816b4-a4da-4bc1-b013-60c522330ded for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.614 187132 DEBUG nova.compute.manager [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Received event network-vif-plugged-476816b4-a4da-4bc1-b013-60c522330ded external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.614 187132 DEBUG oslo_concurrency.lockutils [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.615 187132 DEBUG oslo_concurrency.lockutils [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.615 187132 DEBUG oslo_concurrency.lockutils [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "580cacf3-199a-4e28-a146-69ccaa92a8b9-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.615 187132 DEBUG nova.compute.manager [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] No waiting events found dispatching network-vif-plugged-476816b4-a4da-4bc1-b013-60c522330ded pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:18:18 np0005554845 nova_compute[187128]: 2025-12-11 06:18:18.615 187132 WARNING nova.compute.manager [req-3ed180dd-bf84-4ca1-8470-a1d8961e51e5 req-023831db-8e1b-4dff-be7b-7b151c571ee5 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Received unexpected event network-vif-plugged-476816b4-a4da-4bc1-b013-60c522330ded for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:18:20 np0005554845 nova_compute[187128]: 2025-12-11 06:18:20.832 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:21 np0005554845 nova_compute[187128]: 2025-12-11 06:18:21.516 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:21 np0005554845 nova_compute[187128]: 2025-12-11 06:18:21.856 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:25 np0005554845 nova_compute[187128]: 2025-12-11 06:18:25.833 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:26 np0005554845 podman[223739]: 2025-12-11 06:18:26.11788165 +0000 UTC m=+0.050237666 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:18:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:26.230 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:18:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:26.231 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:18:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:26.231 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:18:26 np0005554845 nova_compute[187128]: 2025-12-11 06:18:26.858 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:30 np0005554845 nova_compute[187128]: 2025-12-11 06:18:30.835 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:31 np0005554845 nova_compute[187128]: 2025-12-11 06:18:31.830 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433896.828772, 580cacf3-199a-4e28-a146-69ccaa92a8b9 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:18:31 np0005554845 nova_compute[187128]: 2025-12-11 06:18:31.830 187132 INFO nova.compute.manager [-] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:18:31 np0005554845 nova_compute[187128]: 2025-12-11 06:18:31.851 187132 DEBUG nova.compute.manager [None req-4b037221-2f81-4942-9c03-705a0ff4b394 - - - - - -] [instance: 580cacf3-199a-4e28-a146-69ccaa92a8b9] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:18:31 np0005554845 nova_compute[187128]: 2025-12-11 06:18:31.860 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:32 np0005554845 podman[223763]: 2025-12-11 06:18:32.118121348 +0000 UTC m=+0.053800913 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.714 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.715 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.715 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.715 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:18:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:33.790 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:82:57 10.100.0.2 2001:db8::f816:3eff:feb0:8257'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feb0:8257/64', 'neutron:device_id': 'ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d06dc841-febe-4a7e-b747-ad772083d6d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39ee0a97-89df-4836-8fd5-1fa735eea42f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=46b91b6b-a8b6-4052-b122-b5d780c4ad3f) old=Port_Binding(mac=['fa:16:3e:b0:82:57 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d06dc841-febe-4a7e-b747-ad772083d6d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:18:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:33.791 104320 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 46b91b6b-a8b6-4052-b122-b5d780c4ad3f in datapath d06dc841-febe-4a7e-b747-ad772083d6d5 updated#033[00m
Dec 11 01:18:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:33.792 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d06dc841-febe-4a7e-b747-ad772083d6d5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:18:33 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:33.793 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3cfa54df-d67b-420e-b7d9-1bd9754fedce]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:33 np0005554845 podman[223784]: 2025-12-11 06:18:33.862190587 +0000 UTC m=+0.096218414 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:18:33 np0005554845 podman[223785]: 2025-12-11 06:18:33.872100146 +0000 UTC m=+0.099740619 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.894 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.895 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5676MB free_disk=73.29174423217773GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.895 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.895 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.981 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:18:33 np0005554845 nova_compute[187128]: 2025-12-11 06:18:33.982 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:18:34 np0005554845 nova_compute[187128]: 2025-12-11 06:18:34.000 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:18:34 np0005554845 nova_compute[187128]: 2025-12-11 06:18:34.015 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:18:34 np0005554845 nova_compute[187128]: 2025-12-11 06:18:34.038 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:18:34 np0005554845 nova_compute[187128]: 2025-12-11 06:18:34.039 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:18:35 np0005554845 nova_compute[187128]: 2025-12-11 06:18:35.035 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:35 np0005554845 nova_compute[187128]: 2025-12-11 06:18:35.035 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:35 np0005554845 nova_compute[187128]: 2025-12-11 06:18:35.035 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:18:35 np0005554845 nova_compute[187128]: 2025-12-11 06:18:35.036 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:18:35 np0005554845 nova_compute[187128]: 2025-12-11 06:18:35.113 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:18:35 np0005554845 nova_compute[187128]: 2025-12-11 06:18:35.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:35 np0005554845 nova_compute[187128]: 2025-12-11 06:18:35.690 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:18:35 np0005554845 nova_compute[187128]: 2025-12-11 06:18:35.836 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:36 np0005554845 nova_compute[187128]: 2025-12-11 06:18:36.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:36 np0005554845 nova_compute[187128]: 2025-12-11 06:18:36.707 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:36 np0005554845 nova_compute[187128]: 2025-12-11 06:18:36.862 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:37 np0005554845 podman[223830]: 2025-12-11 06:18:37.137098674 +0000 UTC m=+0.071367629 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible)
Dec 11 01:18:38 np0005554845 nova_compute[187128]: 2025-12-11 06:18:38.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:39 np0005554845 nova_compute[187128]: 2025-12-11 06:18:39.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:39 np0005554845 nova_compute[187128]: 2025-12-11 06:18:39.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:39 np0005554845 nova_compute[187128]: 2025-12-11 06:18:39.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec 11 01:18:39 np0005554845 nova_compute[187128]: 2025-12-11 06:18:39.708 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec 11 01:18:40 np0005554845 nova_compute[187128]: 2025-12-11 06:18:40.838 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:41 np0005554845 nova_compute[187128]: 2025-12-11 06:18:41.865 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:42 np0005554845 podman[223851]: 2025-12-11 06:18:42.127809501 +0000 UTC m=+0.059346763 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:18:42 np0005554845 podman[223852]: 2025-12-11 06:18:42.130808152 +0000 UTC m=+0.058412257 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, vcs-type=git, maintainer=Red Hat, Inc., managed_by=edpm_ansible, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, release=1755695350, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 11 01:18:42 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:42.977 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b0:82:57 10.100.0.2 2001:db8:0:1:f816:3eff:feb0:8257 2001:db8::f816:3eff:feb0:8257'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8:0:1:f816:3eff:feb0:8257/64 2001:db8::f816:3eff:feb0:8257/64', 'neutron:device_id': 'ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d06dc841-febe-4a7e-b747-ad772083d6d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39ee0a97-89df-4836-8fd5-1fa735eea42f, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=46b91b6b-a8b6-4052-b122-b5d780c4ad3f) old=Port_Binding(mac=['fa:16:3e:b0:82:57 10.100.0.2 2001:db8::f816:3eff:feb0:8257'], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feb0:8257/64', 'neutron:device_id': 'ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d06dc841-febe-4a7e-b747-ad772083d6d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:18:42 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:42.979 104320 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 46b91b6b-a8b6-4052-b122-b5d780c4ad3f in datapath d06dc841-febe-4a7e-b747-ad772083d6d5 updated#033[00m
Dec 11 01:18:42 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:42.980 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d06dc841-febe-4a7e-b747-ad772083d6d5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:18:42 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:42.981 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b95396b6-ae14-4c24-9483-ae815924e6fb]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:18:43 np0005554845 nova_compute[187128]: 2025-12-11 06:18:43.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:43 np0005554845 nova_compute[187128]: 2025-12-11 06:18:43.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec 11 01:18:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:43.779 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=17, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=16) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:18:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:43.780 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:18:43 np0005554845 nova_compute[187128]: 2025-12-11 06:18:43.780 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:18:44.782 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '17'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:18:45 np0005554845 nova_compute[187128]: 2025-12-11 06:18:45.841 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:46 np0005554845 nova_compute[187128]: 2025-12-11 06:18:46.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:18:46 np0005554845 nova_compute[187128]: 2025-12-11 06:18:46.868 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:50 np0005554845 nova_compute[187128]: 2025-12-11 06:18:50.843 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:51 np0005554845 nova_compute[187128]: 2025-12-11 06:18:51.871 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:55 np0005554845 nova_compute[187128]: 2025-12-11 06:18:55.844 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:56 np0005554845 nova_compute[187128]: 2025-12-11 06:18:56.874 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:18:57 np0005554845 podman[223896]: 2025-12-11 06:18:57.12641612 +0000 UTC m=+0.063283519 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:19:00 np0005554845 nova_compute[187128]: 2025-12-11 06:19:00.845 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:01 np0005554845 nova_compute[187128]: 2025-12-11 06:19:01.877 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:02 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:02Z|00289|memory_trim|INFO|Detected inactivity (last active 30005 ms ago): trimming memory
Dec 11 01:19:03 np0005554845 podman[223920]: 2025-12-11 06:19:03.144448664 +0000 UTC m=+0.069844408 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:19:04 np0005554845 podman[223940]: 2025-12-11 06:19:04.120178517 +0000 UTC m=+0.047423608 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 01:19:04 np0005554845 podman[223941]: 2025-12-11 06:19:04.155548988 +0000 UTC m=+0.076741945 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, managed_by=edpm_ansible)
Dec 11 01:19:05 np0005554845 nova_compute[187128]: 2025-12-11 06:19:05.845 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:06 np0005554845 nova_compute[187128]: 2025-12-11 06:19:06.880 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:08 np0005554845 podman[223985]: 2025-12-11 06:19:08.112235328 +0000 UTC m=+0.050066751 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 11 01:19:10 np0005554845 nova_compute[187128]: 2025-12-11 06:19:10.847 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:11 np0005554845 nova_compute[187128]: 2025-12-11 06:19:11.882 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:13 np0005554845 podman[224007]: 2025-12-11 06:19:13.135238721 +0000 UTC m=+0.062699083 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:19:13 np0005554845 podman[224008]: 2025-12-11 06:19:13.175661129 +0000 UTC m=+0.088564286 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., build-date=2025-08-20T13:12:41, version=9.6, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, io.buildah.version=1.33.7, release=1755695350, vcs-type=git, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, config_id=edpm, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 11 01:19:15 np0005554845 nova_compute[187128]: 2025-12-11 06:19:15.848 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:16 np0005554845 nova_compute[187128]: 2025-12-11 06:19:16.884 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.482 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.483 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.484 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.484 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.504 187132 DEBUG nova.compute.manager [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.508 187132 DEBUG nova.compute.manager [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.603 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.604 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.611 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.611 187132 INFO nova.compute.claims [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.678 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.731 187132 DEBUG nova.compute.provider_tree [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.745 187132 DEBUG nova.scheduler.client.report [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.784 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.180s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.785 187132 DEBUG nova.compute.manager [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.788 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.110s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.801 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.801 187132 INFO nova.compute.claims [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.849 187132 DEBUG nova.compute.manager [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.850 187132 DEBUG nova.network.neutron [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.879 187132 INFO nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:19:19 np0005554845 nova_compute[187128]: 2025-12-11 06:19:19.908 187132 DEBUG nova.compute.manager [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.015 187132 DEBUG nova.compute.manager [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.016 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.017 187132 INFO nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Creating image(s)#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.017 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "/var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.018 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "/var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.019 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "/var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.031 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.058 187132 DEBUG nova.compute.provider_tree [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.081 187132 DEBUG nova.scheduler.client.report [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.104 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.073s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.105 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.106 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.116 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.131 187132 DEBUG nova.policy [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.137 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.349s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.138 187132 DEBUG nova.compute.manager [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.169 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.169 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.212 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk 1073741824" returned: 0 in 0.043s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.213 187132 DEBUG nova.compute.manager [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.214 187132 DEBUG nova.network.neutron [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.217 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.112s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.218 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.238 187132 INFO nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.261 187132 DEBUG nova.compute.manager [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.272 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.054s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.273 187132 DEBUG nova.virt.disk.api [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Checking if we can resize image /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.274 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.330 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk --force-share --output=json" returned: 0 in 0.057s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.331 187132 DEBUG nova.virt.disk.api [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Cannot resize image /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.332 187132 DEBUG nova.objects.instance [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lazy-loading 'migration_context' on Instance uuid 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.349 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.349 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Ensure instance console log exists: /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.350 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.351 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.351 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.393 187132 DEBUG nova.compute.manager [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.394 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.395 187132 INFO nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Creating image(s)
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.395 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "/var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.395 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.396 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.411 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.475 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.476 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.476 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.488 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.529 187132 DEBUG nova.policy [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.543 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.055s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.543 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.576 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk 1073741824" returned: 0 in 0.033s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.577 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.101s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.578 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.630 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.632 187132 DEBUG nova.virt.disk.api [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Checking if we can resize image /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.632 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.685 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk --force-share --output=json" returned: 0 in 0.053s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.686 187132 DEBUG nova.virt.disk.api [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Cannot resize image /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.686 187132 DEBUG nova.objects.instance [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'migration_context' on Instance uuid ce6856f2-bbd2-465e-a1bd-4af8e2f38591 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.707 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.708 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.709 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Ensure instance console log exists: /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.709 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.710 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.710 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.741 187132 WARNING nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] While synchronizing instance power states, found 2 instances in the database and 0 instances on the hypervisor.
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.741 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Triggering sync for uuid 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.742 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Triggering sync for uuid ce6856f2-bbd2-465e-a1bd-4af8e2f38591 _sync_power_states /usr/lib/python3.9/site-packages/nova/compute/manager.py:10268
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.742 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.743 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:19:20 np0005554845 nova_compute[187128]: 2025-12-11 06:19:20.849 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:21 np0005554845 nova_compute[187128]: 2025-12-11 06:19:21.137 187132 DEBUG nova.network.neutron [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Successfully created port: 4130ae9c-75bf-4b86-9a73-77d0424ede65 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 11 01:19:21 np0005554845 nova_compute[187128]: 2025-12-11 06:19:21.887 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:21 np0005554845 nova_compute[187128]: 2025-12-11 06:19:21.924 187132 DEBUG nova.network.neutron [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Successfully created port: e34c30d9-f946-403a-8b26-75716a1be5df _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548
Dec 11 01:19:22 np0005554845 nova_compute[187128]: 2025-12-11 06:19:22.223 187132 DEBUG nova.network.neutron [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Successfully updated port: 4130ae9c-75bf-4b86-9a73-77d0424ede65 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 11 01:19:22 np0005554845 nova_compute[187128]: 2025-12-11 06:19:22.244 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "refresh_cache-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 01:19:22 np0005554845 nova_compute[187128]: 2025-12-11 06:19:22.244 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquired lock "refresh_cache-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 01:19:22 np0005554845 nova_compute[187128]: 2025-12-11 06:19:22.245 187132 DEBUG nova.network.neutron [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 11 01:19:22 np0005554845 nova_compute[187128]: 2025-12-11 06:19:22.325 187132 DEBUG nova.compute.manager [req-51daa51d-c33c-4c89-a754-f80c694053bb req-f6a045c6-04f8-4871-8350-f6a6a0e92299 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Received event network-changed-4130ae9c-75bf-4b86-9a73-77d0424ede65 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:19:22 np0005554845 nova_compute[187128]: 2025-12-11 06:19:22.325 187132 DEBUG nova.compute.manager [req-51daa51d-c33c-4c89-a754-f80c694053bb req-f6a045c6-04f8-4871-8350-f6a6a0e92299 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Refreshing instance network info cache due to event network-changed-4130ae9c-75bf-4b86-9a73-77d0424ede65. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 01:19:22 np0005554845 nova_compute[187128]: 2025-12-11 06:19:22.326 187132 DEBUG oslo_concurrency.lockutils [req-51daa51d-c33c-4c89-a754-f80c694053bb req-f6a045c6-04f8-4871-8350-f6a6a0e92299 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 01:19:22 np0005554845 nova_compute[187128]: 2025-12-11 06:19:22.465 187132 DEBUG nova.network.neutron [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 11 01:19:23 np0005554845 nova_compute[187128]: 2025-12-11 06:19:23.587 187132 DEBUG nova.network.neutron [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Successfully updated port: e34c30d9-f946-403a-8b26-75716a1be5df _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586
Dec 11 01:19:23 np0005554845 nova_compute[187128]: 2025-12-11 06:19:23.611 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 01:19:23 np0005554845 nova_compute[187128]: 2025-12-11 06:19:23.612 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquired lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 01:19:23 np0005554845 nova_compute[187128]: 2025-12-11 06:19:23.612 187132 DEBUG nova.network.neutron [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010
Dec 11 01:19:24 np0005554845 nova_compute[187128]: 2025-12-11 06:19:24.431 187132 DEBUG nova.compute.manager [req-e0ccaed9-86ac-4bfb-9cfa-ff2ed716de39 req-2332fea8-e3a2-42b0-b655-ffa7874372ce eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Received event network-changed-e34c30d9-f946-403a-8b26-75716a1be5df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:19:24 np0005554845 nova_compute[187128]: 2025-12-11 06:19:24.431 187132 DEBUG nova.compute.manager [req-e0ccaed9-86ac-4bfb-9cfa-ff2ed716de39 req-2332fea8-e3a2-42b0-b655-ffa7874372ce eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Refreshing instance network info cache due to event network-changed-e34c30d9-f946-403a-8b26-75716a1be5df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 01:19:24 np0005554845 nova_compute[187128]: 2025-12-11 06:19:24.432 187132 DEBUG oslo_concurrency.lockutils [req-e0ccaed9-86ac-4bfb-9cfa-ff2ed716de39 req-2332fea8-e3a2-42b0-b655-ffa7874372ce eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.010 187132 DEBUG nova.network.neutron [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.035 187132 DEBUG nova.network.neutron [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Updating instance_info_cache with network_info: [{"id": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "address": "fa:16:3e:d9:28:31", "network": {"id": "70d1c6fd-8452-4bef-babc-0687c3b7f28f", "bridge": "br-int", "label": "tempest-network-smoke--1735160449", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4130ae9c-75", "ovs_interfaceid": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.060 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Releasing lock "refresh_cache-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.061 187132 DEBUG nova.compute.manager [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Instance network_info: |[{"id": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "address": "fa:16:3e:d9:28:31", "network": {"id": "70d1c6fd-8452-4bef-babc-0687c3b7f28f", "bridge": "br-int", "label": "tempest-network-smoke--1735160449", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4130ae9c-75", "ovs_interfaceid": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.061 187132 DEBUG oslo_concurrency.lockutils [req-51daa51d-c33c-4c89-a754-f80c694053bb req-f6a045c6-04f8-4871-8350-f6a6a0e92299 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.061 187132 DEBUG nova.network.neutron [req-51daa51d-c33c-4c89-a754-f80c694053bb req-f6a045c6-04f8-4871-8350-f6a6a0e92299 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Refreshing network info cache for port 4130ae9c-75bf-4b86-9a73-77d0424ede65 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.064 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Start _get_guest_xml network_info=[{"id": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "address": "fa:16:3e:d9:28:31", "network": {"id": "70d1c6fd-8452-4bef-babc-0687c3b7f28f", "bridge": "br-int", "label": "tempest-network-smoke--1735160449", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4130ae9c-75", "ovs_interfaceid": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.068 187132 WARNING nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.071 187132 DEBUG nova.virt.libvirt.host [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.072 187132 DEBUG nova.virt.libvirt.host [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.075 187132 DEBUG nova.virt.libvirt.host [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.075 187132 DEBUG nova.virt.libvirt.host [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.076 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.076 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.077 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.077 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.077 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.077 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.077 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.078 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.078 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.078 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.078 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.078 187132 DEBUG nova.virt.hardware [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.081 187132 DEBUG nova.virt.libvirt.vif [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:19:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2036320412-ge',id=46,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG8JKWy+EWk1Ndcw6fVQ26alNEwWP4gw2+NTnNH3PedrzxDTvbUcFjLRogcSgB2p5+KEQUW/zt4BNLK2292WmZR8aDv5jUsLCWToQeEWQicJwkexSkfuo677WEhwXkhZWQ==',key_name='tempest-TestSecurityGroupsBasicOps-2112506805',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d8630abd3cd4aef89d0b1af6e62ac93',ramdisk_id='',reservation_id='r-8hit10ov',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2036320412',owner_user_name='tempest-TestSecurityGroupsBasicOps-2036320412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:19:19Z,user_data=None,user_id='78548cbaea0e406ebb716882c382c954',uuid=8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "address": "fa:16:3e:d9:28:31", "network": {"id": "70d1c6fd-8452-4bef-babc-0687c3b7f28f", "bridge": "br-int", "label": "tempest-network-smoke--1735160449", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4130ae9c-75", "ovs_interfaceid": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.082 187132 DEBUG nova.network.os_vif_util [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converting VIF {"id": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "address": "fa:16:3e:d9:28:31", "network": {"id": "70d1c6fd-8452-4bef-babc-0687c3b7f28f", "bridge": "br-int", "label": "tempest-network-smoke--1735160449", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4130ae9c-75", "ovs_interfaceid": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.082 187132 DEBUG nova.network.os_vif_util [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:28:31,bridge_name='br-int',has_traffic_filtering=True,id=4130ae9c-75bf-4b86-9a73-77d0424ede65,network=Network(70d1c6fd-8452-4bef-babc-0687c3b7f28f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4130ae9c-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.083 187132 DEBUG nova.objects.instance [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lazy-loading 'pci_devices' on Instance uuid 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.099 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  <uuid>8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230</uuid>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  <name>instance-0000002e</name>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918</nova:name>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:19:25</nova:creationTime>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:        <nova:user uuid="78548cbaea0e406ebb716882c382c954">tempest-TestSecurityGroupsBasicOps-2036320412-project-member</nova:user>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:        <nova:project uuid="9d8630abd3cd4aef89d0b1af6e62ac93">tempest-TestSecurityGroupsBasicOps-2036320412</nova:project>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:        <nova:port uuid="4130ae9c-75bf-4b86-9a73-77d0424ede65">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <entry name="serial">8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230</entry>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <entry name="uuid">8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230</entry>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.config"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:d9:28:31"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <target dev="tap4130ae9c-75"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/console.log" append="off"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:19:25 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:19:25 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:19:25 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:19:25 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.100 187132 DEBUG nova.compute.manager [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Preparing to wait for external event network-vif-plugged-4130ae9c-75bf-4b86-9a73-77d0424ede65 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.101 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.101 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.101 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.102 187132 DEBUG nova.virt.libvirt.vif [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:19:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2036320412-ge',id=46,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG8JKWy+EWk1Ndcw6fVQ26alNEwWP4gw2+NTnNH3PedrzxDTvbUcFjLRogcSgB2p5+KEQUW/zt4BNLK2292WmZR8aDv5jUsLCWToQeEWQicJwkexSkfuo677WEhwXkhZWQ==',key_name='tempest-TestSecurityGroupsBasicOps-2112506805',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d8630abd3cd4aef89d0b1af6e62ac93',ramdisk_id='',reservation_id='r-8hit10ov',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2036320412',owner_user_name='tempest-TestSecurityGroupsBasicOps-2036320412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:19:19Z,user_data=None,user_id='78548cbaea0e406ebb716882c382c954',uuid=8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "address": "fa:16:3e:d9:28:31", "network": {"id": "70d1c6fd-8452-4bef-babc-0687c3b7f28f", "bridge": "br-int", "label": "tempest-network-smoke--1735160449", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4130ae9c-75", "ovs_interfaceid": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.102 187132 DEBUG nova.network.os_vif_util [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converting VIF {"id": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "address": "fa:16:3e:d9:28:31", "network": {"id": "70d1c6fd-8452-4bef-babc-0687c3b7f28f", "bridge": "br-int", "label": "tempest-network-smoke--1735160449", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4130ae9c-75", "ovs_interfaceid": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.103 187132 DEBUG nova.network.os_vif_util [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:d9:28:31,bridge_name='br-int',has_traffic_filtering=True,id=4130ae9c-75bf-4b86-9a73-77d0424ede65,network=Network(70d1c6fd-8452-4bef-babc-0687c3b7f28f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4130ae9c-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.103 187132 DEBUG os_vif [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:28:31,bridge_name='br-int',has_traffic_filtering=True,id=4130ae9c-75bf-4b86-9a73-77d0424ede65,network=Network(70d1c6fd-8452-4bef-babc-0687c3b7f28f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4130ae9c-75') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.103 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.104 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.104 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.108 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.108 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap4130ae9c-75, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.109 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap4130ae9c-75, col_values=(('external_ids', {'iface-id': '4130ae9c-75bf-4b86-9a73-77d0424ede65', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:d9:28:31', 'vm-uuid': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.110 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:25 np0005554845 NetworkManager[55529]: <info>  [1765433965.1120] manager: (tap4130ae9c-75): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/146)
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.113 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.120 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.121 187132 INFO os_vif [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:d9:28:31,bridge_name='br-int',has_traffic_filtering=True,id=4130ae9c-75bf-4b86-9a73-77d0424ede65,network=Network(70d1c6fd-8452-4bef-babc-0687c3b7f28f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4130ae9c-75')#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.231 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.232 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.232 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] No VIF found with MAC fa:16:3e:d9:28:31, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.232 187132 INFO nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Using config drive#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.619 187132 INFO nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Creating config drive at /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.config#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.625 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_eqdo102 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.766 187132 DEBUG oslo_concurrency.processutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp_eqdo102" returned: 0 in 0.141s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:19:25 np0005554845 kernel: tap4130ae9c-75: entered promiscuous mode
Dec 11 01:19:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:25Z|00290|binding|INFO|Claiming lport 4130ae9c-75bf-4b86-9a73-77d0424ede65 for this chassis.
Dec 11 01:19:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:25Z|00291|binding|INFO|4130ae9c-75bf-4b86-9a73-77d0424ede65: Claiming fa:16:3e:d9:28:31 10.100.0.9
Dec 11 01:19:25 np0005554845 NetworkManager[55529]: <info>  [1765433965.8276] manager: (tap4130ae9c-75): new Tun device (/org/freedesktop/NetworkManager/Devices/147)
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.827 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.829 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.839 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.843 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:25 np0005554845 NetworkManager[55529]: <info>  [1765433965.8443] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/148)
Dec 11 01:19:25 np0005554845 NetworkManager[55529]: <info>  [1765433965.8449] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/149)
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.847 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:28:31 10.100.0.9'], port_security=['fa:16:3e:d9:28:31 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70d1c6fd-8452-4bef-babc-0687c3b7f28f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'neutron:revision_number': '2', 'neutron:security_group_ids': 'b08d38e1-0070-4521-a08b-d01957bc4e8b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c7d0767-431c-4590-9f2e-0b039556ea48, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=4130ae9c-75bf-4b86-9a73-77d0424ede65) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.848 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 4130ae9c-75bf-4b86-9a73-77d0424ede65 in datapath 70d1c6fd-8452-4bef-babc-0687c3b7f28f bound to our chassis#033[00m
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.849 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 70d1c6fd-8452-4bef-babc-0687c3b7f28f#033[00m
Dec 11 01:19:25 np0005554845 systemd-udevd[224101]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.865 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[99c93eed-3284-47d1-8ef2-0988b74503a2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.866 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap70d1c6fd-81 in ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:19:25 np0005554845 NetworkManager[55529]: <info>  [1765433965.8677] device (tap4130ae9c-75): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:19:25 np0005554845 NetworkManager[55529]: <info>  [1765433965.8682] device (tap4130ae9c-75): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.867 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap70d1c6fd-80 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.868 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[35ca116e-7f59-470c-98d5-efca388b7d85]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:25 np0005554845 systemd-machined[153381]: New machine qemu-22-instance-0000002e.
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.869 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[f5452168-2d2c-4cc2-bec0-25c088e26334]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.883 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[07f159c0-b495-416b-893e-fb2b5a434253]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.918 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6d05934a-6045-4cbb-a812-43d21f317728]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:25 np0005554845 systemd[1]: Started Virtual Machine qemu-22-instance-0000002e.
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.950 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[268467f2-c8e8-4ad2-a8e0-539bbcf58134]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:25 np0005554845 NetworkManager[55529]: <info>  [1765433965.9596] manager: (tap70d1c6fd-80): new Veth device (/org/freedesktop/NetworkManager/Devices/150)
Dec 11 01:19:25 np0005554845 systemd-udevd[224105]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.961 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[dceb9b9b-4616-45a1-a0f3-127bff4d8dbe]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.965 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.980 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:25Z|00292|binding|INFO|Setting lport 4130ae9c-75bf-4b86-9a73-77d0424ede65 ovn-installed in OVS
Dec 11 01:19:25 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:25Z|00293|binding|INFO|Setting lport 4130ae9c-75bf-4b86-9a73-77d0424ede65 up in Southbound
Dec 11 01:19:25 np0005554845 nova_compute[187128]: 2025-12-11 06:19:25.990 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.995 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[085a9ca5-bec5-49dd-9f50-8a0856f2cec8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:25 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:25.998 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[5fef71c0-0850-4ffa-aa56-7ada5243703e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:26 np0005554845 NetworkManager[55529]: <info>  [1765433966.0166] device (tap70d1c6fd-80): carrier: link connected
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.020 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[600a342e-6598-410d-844a-20b615ada7ea]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.036 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c226874a-f6ca-48a3-a2c4-0897242aeb9a]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap70d1c6fd-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:66:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424066, 'reachable_time': 28036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224135, 'error': None, 'target': 'ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.060 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[597d9c8d-efd7-493e-bba3-e581c136861c]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fed6:666b'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424066, 'tstamp': 424066}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224136, 'error': None, 'target': 'ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.079 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8a3187e6-9c43-47ef-997c-3479a6bc674c]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap70d1c6fd-81'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:d6:66:6b'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 88], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424066, 'reachable_time': 28036, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224137, 'error': None, 'target': 'ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.125 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[abf1dc59-3ea4-4cdd-b81e-dabdf3b2ba95]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.183 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a8961954-30f6-41f6-97e3-1af4c983c057]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.185 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70d1c6fd-80, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.185 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.185 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap70d1c6fd-80, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.187 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:26 np0005554845 kernel: tap70d1c6fd-80: entered promiscuous mode
Dec 11 01:19:26 np0005554845 NetworkManager[55529]: <info>  [1765433966.1898] manager: (tap70d1c6fd-80): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/151)
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.189 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.190 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap70d1c6fd-80, col_values=(('external_ids', {'iface-id': '040c3387-a80a-4a6f-95e1-64615fb6af8c'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.191 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:26 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:26Z|00294|binding|INFO|Releasing lport 040c3387-a80a-4a6f-95e1-64615fb6af8c from this chassis (sb_readonly=0)
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.204 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.205 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/70d1c6fd-8452-4bef-babc-0687c3b7f28f.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/70d1c6fd-8452-4bef-babc-0687c3b7f28f.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.206 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8c319501-1192-4cf5-b549-8b645413dd65]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.206 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-70d1c6fd-8452-4bef-babc-0687c3b7f28f
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/70d1c6fd-8452-4bef-babc-0687c3b7f28f.pid.haproxy
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 70d1c6fd-8452-4bef-babc-0687c3b7f28f
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.207 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f', 'env', 'PROCESS_TAG=haproxy-70d1c6fd-8452-4bef-babc-0687c3b7f28f', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/70d1c6fd-8452-4bef-babc-0687c3b7f28f.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.231 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.232 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:26.232 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.383 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433966.3831236, 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.383 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] VM Started (Lifecycle Event)#033[00m
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.401 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.405 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433966.3852632, 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.405 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.430 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.435 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:19:26 np0005554845 nova_compute[187128]: 2025-12-11 06:19:26.480 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:19:26 np0005554845 podman[224176]: 2025-12-11 06:19:26.713293979 +0000 UTC m=+0.110741558 container create a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 01:19:26 np0005554845 podman[224176]: 2025-12-11 06:19:26.625751791 +0000 UTC m=+0.023199390 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:19:26 np0005554845 systemd[1]: Started libpod-conmon-a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79.scope.
Dec 11 01:19:26 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:19:26 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/c461b9f2376e7106a4becd3538b59684b24222a26782cc97116f8a074e02453c/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:19:26 np0005554845 podman[224176]: 2025-12-11 06:19:26.928228505 +0000 UTC m=+0.325676154 container init a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 11 01:19:26 np0005554845 podman[224176]: 2025-12-11 06:19:26.93503545 +0000 UTC m=+0.332483079 container start a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Dec 11 01:19:26 np0005554845 neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f[224191]: [NOTICE]   (224195) : New worker (224197) forked
Dec 11 01:19:26 np0005554845 neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f[224191]: [NOTICE]   (224195) : Loading success.
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.853 187132 DEBUG nova.compute.manager [req-6f5d7d67-8565-401c-8196-3aa30733e657 req-3f00904d-2f79-4761-b76a-3434987ad271 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Received event network-vif-plugged-4130ae9c-75bf-4b86-9a73-77d0424ede65 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.853 187132 DEBUG oslo_concurrency.lockutils [req-6f5d7d67-8565-401c-8196-3aa30733e657 req-3f00904d-2f79-4761-b76a-3434987ad271 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.853 187132 DEBUG oslo_concurrency.lockutils [req-6f5d7d67-8565-401c-8196-3aa30733e657 req-3f00904d-2f79-4761-b76a-3434987ad271 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.854 187132 DEBUG oslo_concurrency.lockutils [req-6f5d7d67-8565-401c-8196-3aa30733e657 req-3f00904d-2f79-4761-b76a-3434987ad271 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.854 187132 DEBUG nova.compute.manager [req-6f5d7d67-8565-401c-8196-3aa30733e657 req-3f00904d-2f79-4761-b76a-3434987ad271 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Processing event network-vif-plugged-4130ae9c-75bf-4b86-9a73-77d0424ede65 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.855 187132 DEBUG nova.compute.manager [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Instance event wait completed in 1 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.861 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433967.8614452, 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.862 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.871 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.876 187132 INFO nova.virt.libvirt.driver [-] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Instance spawned successfully.#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.877 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.913 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.919 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.920 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.920 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.921 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.921 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.921 187132 DEBUG nova.virt.libvirt.driver [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.928 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:19:27 np0005554845 nova_compute[187128]: 2025-12-11 06:19:27.972 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:19:28 np0005554845 nova_compute[187128]: 2025-12-11 06:19:28.013 187132 INFO nova.compute.manager [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Took 8.00 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:19:28 np0005554845 nova_compute[187128]: 2025-12-11 06:19:28.014 187132 DEBUG nova.compute.manager [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:19:28 np0005554845 nova_compute[187128]: 2025-12-11 06:19:28.099 187132 INFO nova.compute.manager [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Took 8.52 seconds to build instance.#033[00m
Dec 11 01:19:28 np0005554845 nova_compute[187128]: 2025-12-11 06:19:28.120 187132 DEBUG oslo_concurrency.lockutils [None req-aaa3280e-1af3-40bb-92f9-d8697d67125d 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 8.638s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:28 np0005554845 nova_compute[187128]: 2025-12-11 06:19:28.121 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 7.379s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:28 np0005554845 nova_compute[187128]: 2025-12-11 06:19:28.121 187132 INFO nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:19:28 np0005554845 nova_compute[187128]: 2025-12-11 06:19:28.121 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:28 np0005554845 podman[224206]: 2025-12-11 06:19:28.143993019 +0000 UTC m=+0.069228752 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:19:28 np0005554845 nova_compute[187128]: 2025-12-11 06:19:28.951 187132 DEBUG nova.network.neutron [req-51daa51d-c33c-4c89-a754-f80c694053bb req-f6a045c6-04f8-4871-8350-f6a6a0e92299 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Updated VIF entry in instance network info cache for port 4130ae9c-75bf-4b86-9a73-77d0424ede65. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:19:28 np0005554845 nova_compute[187128]: 2025-12-11 06:19:28.952 187132 DEBUG nova.network.neutron [req-51daa51d-c33c-4c89-a754-f80c694053bb req-f6a045c6-04f8-4871-8350-f6a6a0e92299 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Updating instance_info_cache with network_info: [{"id": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "address": "fa:16:3e:d9:28:31", "network": {"id": "70d1c6fd-8452-4bef-babc-0687c3b7f28f", "bridge": "br-int", "label": "tempest-network-smoke--1735160449", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4130ae9c-75", "ovs_interfaceid": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:19:28 np0005554845 nova_compute[187128]: 2025-12-11 06:19:28.968 187132 DEBUG oslo_concurrency.lockutils [req-51daa51d-c33c-4c89-a754-f80c694053bb req-f6a045c6-04f8-4871-8350-f6a6a0e92299 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.558 187132 DEBUG nova.network.neutron [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Updating instance_info_cache with network_info: [{"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.635 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Releasing lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.636 187132 DEBUG nova.compute.manager [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Instance network_info: |[{"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": 
true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.637 187132 DEBUG oslo_concurrency.lockutils [req-e0ccaed9-86ac-4bfb-9cfa-ff2ed716de39 req-2332fea8-e3a2-42b0-b655-ffa7874372ce eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.638 187132 DEBUG nova.network.neutron [req-e0ccaed9-86ac-4bfb-9cfa-ff2ed716de39 req-2332fea8-e3a2-42b0-b655-ffa7874372ce eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Refreshing network info cache for port e34c30d9-f946-403a-8b26-75716a1be5df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.644 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Start _get_guest_xml network_info=[{"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, 
"delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.651 187132 WARNING nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.659 187132 DEBUG nova.virt.libvirt.host [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.660 187132 DEBUG nova.virt.libvirt.host [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.667 187132 DEBUG nova.virt.libvirt.host [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.668 187132 DEBUG nova.virt.libvirt.host [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.669 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.669 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.670 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.670 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.670 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.670 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.671 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.671 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.671 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.671 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.672 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.672 187132 DEBUG nova.virt.hardware [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.675 187132 DEBUG nova.virt.libvirt.vif [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:19:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2028969009',display_name='tempest-TestGettingAddress-server-2028969009',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2028969009',id=47,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEEUaNKC1fg61QRRrAyHgRKiPKfEO7WxkneeQc0tv515uDr9lv/Nkq7yTmkNzmXruBDzaCGCnBqkZmh2CfljQIGqPjTR+62tkla/qpVrvL9f2FwN38U1AwR6cA9d81fr6A==',key_name='tempest-TestGettingAddress-305222553',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-lnegqfzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:19:20Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=ce6856f2-bbd2-465e-a1bd-4af8e2f38591,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.675 187132 DEBUG nova.network.os_vif_util [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.676 187132 DEBUG nova.network.os_vif_util [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:8b:82,bridge_name='br-int',has_traffic_filtering=True,id=e34c30d9-f946-403a-8b26-75716a1be5df,network=Network(d06dc841-febe-4a7e-b747-ad772083d6d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c30d9-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.677 187132 DEBUG nova.objects.instance [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid ce6856f2-bbd2-465e-a1bd-4af8e2f38591 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.695 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  <uuid>ce6856f2-bbd2-465e-a1bd-4af8e2f38591</uuid>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  <name>instance-0000002f</name>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestGettingAddress-server-2028969009</nova:name>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:19:29</nova:creationTime>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:        <nova:user uuid="60e9372de4754580913a836e11b9c248">tempest-TestGettingAddress-725523770-project-member</nova:user>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:        <nova:project uuid="79a211a6fc3c4f68b6c3d0ba433964d3">tempest-TestGettingAddress-725523770</nova:project>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:        <nova:port uuid="e34c30d9-f946-403a-8b26-75716a1be5df">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.9" ipVersion="4"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="2001:db8:0:1:f816:3eff:fe06:8b82" ipVersion="6"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe06:8b82" ipVersion="6"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <entry name="serial">ce6856f2-bbd2-465e-a1bd-4af8e2f38591</entry>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <entry name="uuid">ce6856f2-bbd2-465e-a1bd-4af8e2f38591</entry>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk.config"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:06:8b:82"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <target dev="tape34c30d9-f9"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/console.log" append="off"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:19:29 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:19:29 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:19:29 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:19:29 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.696 187132 DEBUG nova.compute.manager [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Preparing to wait for external event network-vif-plugged-e34c30d9-f946-403a-8b26-75716a1be5df prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.696 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.697 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.697 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.697 187132 DEBUG nova.virt.libvirt.vif [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:19:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2028969009',display_name='tempest-TestGettingAddress-server-2028969009',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2028969009',id=47,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEEUaNKC1fg61QRRrAyHgRKiPKfEO7WxkneeQc0tv515uDr9lv/Nkq7yTmkNzmXruBDzaCGCnBqkZmh2CfljQIGqPjTR+62tkla/qpVrvL9f2FwN38U1AwR6cA9d81fr6A==',key_name='tempest-TestGettingAddress-305222553',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-lnegqfzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:19:20Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=ce6856f2-bbd2-465e-a1bd-4af8e2f38591,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.698 187132 DEBUG nova.network.os_vif_util [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif 
/usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.698 187132 DEBUG nova.network.os_vif_util [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:06:8b:82,bridge_name='br-int',has_traffic_filtering=True,id=e34c30d9-f946-403a-8b26-75716a1be5df,network=Network(d06dc841-febe-4a7e-b747-ad772083d6d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c30d9-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.698 187132 DEBUG os_vif [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:8b:82,bridge_name='br-int',has_traffic_filtering=True,id=e34c30d9-f946-403a-8b26-75716a1be5df,network=Network(d06dc841-febe-4a7e-b747-ad772083d6d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c30d9-f9') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.699 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.699 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.700 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.704 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.705 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape34c30d9-f9, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.705 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape34c30d9-f9, col_values=(('external_ids', {'iface-id': 'e34c30d9-f946-403a-8b26-75716a1be5df', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:06:8b:82', 'vm-uuid': 'ce6856f2-bbd2-465e-a1bd-4af8e2f38591'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.707 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:29 np0005554845 NetworkManager[55529]: <info>  [1765433969.7085] manager: (tape34c30d9-f9): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/152)
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.710 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.715 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.717 187132 INFO os_vif [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:06:8b:82,bridge_name='br-int',has_traffic_filtering=True,id=e34c30d9-f946-403a-8b26-75716a1be5df,network=Network(d06dc841-febe-4a7e-b747-ad772083d6d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c30d9-f9')#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.812 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.812 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.813 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No VIF found with MAC fa:16:3e:06:8b:82, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.813 187132 INFO nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Using config drive#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.944 187132 DEBUG nova.compute.manager [req-15dd425b-52aa-498a-9d9d-5cb5a88662ca req-196f3e2e-9681-46c3-8241-96166438f432 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Received event network-vif-plugged-4130ae9c-75bf-4b86-9a73-77d0424ede65 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.945 187132 DEBUG oslo_concurrency.lockutils [req-15dd425b-52aa-498a-9d9d-5cb5a88662ca req-196f3e2e-9681-46c3-8241-96166438f432 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.945 187132 DEBUG oslo_concurrency.lockutils [req-15dd425b-52aa-498a-9d9d-5cb5a88662ca req-196f3e2e-9681-46c3-8241-96166438f432 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.945 187132 DEBUG oslo_concurrency.lockutils [req-15dd425b-52aa-498a-9d9d-5cb5a88662ca req-196f3e2e-9681-46c3-8241-96166438f432 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.946 187132 DEBUG nova.compute.manager [req-15dd425b-52aa-498a-9d9d-5cb5a88662ca req-196f3e2e-9681-46c3-8241-96166438f432 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] No waiting events found dispatching network-vif-plugged-4130ae9c-75bf-4b86-9a73-77d0424ede65 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:19:29 np0005554845 nova_compute[187128]: 2025-12-11 06:19:29.946 187132 WARNING nova.compute.manager [req-15dd425b-52aa-498a-9d9d-5cb5a88662ca req-196f3e2e-9681-46c3-8241-96166438f432 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Received unexpected event network-vif-plugged-4130ae9c-75bf-4b86-9a73-77d0424ede65 for instance with vm_state active and task_state None.#033[00m
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.104 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': 'ce6856f2-bbd2-465e-a1bd-4af8e2f38591', 'name': 'tempest-TestGettingAddress-server-2028969009', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002f', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'shutdown', 'tenant_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'user_id': '60e9372de4754580913a836e11b9c248', 'hostId': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'status': 'stopped', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.107 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-0000002e', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'user_id': '78548cbaea0e406ebb716882c382c954', 'hostId': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.107 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.109 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of network.incoming.bytes.delta: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.112 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230 / tap4130ae9c-75 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.112 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '7e9d27d3-98f5-48ac-ab7c-d63e5b5b4df2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-0000002e-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-tap4130ae9c-75', 'timestamp': '2025-12-11T06:19:30.107934', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'tap4130ae9c-75', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:28:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4130ae9c-75'}, 'message_id': '5a228306-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.811501792, 'message_signature': '873bfacfb5fd7d115d35aa2e8624373479a3b0e0d51e8f9a4558466375f1d0f4'}]}, 'timestamp': '2025-12-11 06:19:30.113229', '_unique_id': 'd20cd27a42a549e581b7759e5de2d41c'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.114 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.115 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.116 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of cpu: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.138 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/cpu volume: 2210000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '23b8cecb-800f-4c64-869d-bbdbbbbe1c64', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 2210000000, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'timestamp': '2025-12-11T06:19:30.115735', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': '5a2665fc-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.839685877, 'message_signature': '31adb1ed597c82dcbcfa6509f08c284754f7373327cebffa71dc5c90b284261d'}]}, 'timestamp': '2025-12-11 06:19:30.138691', '_unique_id': '78428f0ee51a45329d6ba745d0f92f95'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.139 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.140 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.141 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of disk.device.read.latency: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.178 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.read.latency volume: 143348231 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.178 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.read.latency volume: 710429 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd4346a26-c3ac-418a-939b-0055e4910b5c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 143348231, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-vda', 'timestamp': '2025-12-11T06:19:30.140807', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5a2c8428-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': 'dbdec5f67317148421f70886d954f75ee9504b5111bf78dfd20f8bc48b60d724'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 710429, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-sda', 'timestamp': '2025-12-11T06:19:30.140807', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5a2c8fae-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': 'd142a8376e0bb3504d79ec77e0783c313b1e8befbae2c639ebdbab6ce829f2c6'}]}, 'timestamp': '2025-12-11 06:19:30.178978', '_unique_id': 'b4106011f3024175af4517ae010a7d70'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.180 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.181 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.181 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-2028969009>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-2028969009>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918>]
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.181 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.181 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.181 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-2028969009>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-2028969009>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918>]
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.181 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.182 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of disk.device.read.requests: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.182 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.182 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '97bc4c04-d795-4d0c-8460-2e85fc9d1079', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-vda', 'timestamp': '2025-12-11T06:19:30.181869', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5a2d2c48-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': '04951a6606d8d1778780174f48763291035b18ef77c2bcc72f54920c2e4ebb72'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-sda', 'timestamp': '2025-12-11T06:19:30.181869', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5a2d34b8-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': 'ad50c3ab5d4725b6c0299057aece551a1cefc68d8a3ea4bdc5b63324f2886cd5'}]}, 'timestamp': '2025-12-11 06:19:30.183202', '_unique_id': '6aaa6a5c8a9e4d30b1da5e013b61424e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.183 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.184 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.185 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of network.outgoing.packets.drop: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.185 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '0defe84a-d675-46de-9fb8-b0e785b9735c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-0000002e-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-tap4130ae9c-75', 'timestamp': '2025-12-11T06:19:30.184702', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'tap4130ae9c-75', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:28:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4130ae9c-75'}, 'message_id': '5a2d94d0-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.811501792, 'message_signature': 'e627fbda7d085132feb57172866287df4d83e899ccff0331496b37ce82c1e38a'}]}, 'timestamp': '2025-12-11 06:19:30.185677', '_unique_id': '57f92beaa1c6431fb4026b5bd3b5cd12'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.186 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-2028969009>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-2028969009>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918>]
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.187 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.188 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of disk.device.write.latency: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.188 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.188 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fb446980-3c6d-46aa-91e8-a4804b5dacb6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-vda', 'timestamp': '2025-12-11T06:19:30.187114', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5a2dfe2a-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': '6a924efa938a5e9bc57f128125de68e88f01701cef5a5c78bf6eb189ddc35396'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-sda', 'timestamp': '2025-12-11T06:19:30.187114', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5a2e0e06-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': '57686ffc6889733e6cc942b51d6cbe9d656c9601b21c5280e9db9e8a1d9003f0'}]}, 'timestamp': '2025-12-11 06:19:30.188743', '_unique_id': 'b0b30202dcc94b20aa4dcb62de60cc5f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.189 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.190 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of network.outgoing.packets: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.190 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5ad7fadc-8591-43d9-b0b9-5c5217ba6111', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-0000002e-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-tap4130ae9c-75', 'timestamp': '2025-12-11T06:19:30.189881', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'tap4130ae9c-75', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:28:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4130ae9c-75'}, 'message_id': '5a2e592e-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.811501792, 'message_signature': 'b5e2770fee2bba808e502e65bbc26e3e236f3d2f421c3a765e04fddcbdbe919c'}]}, 'timestamp': '2025-12-11 06:19:30.190679', '_unique_id': 'f4c765692e604d618be91a5e76396f9f'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.191 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.192 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of network.outgoing.packets.error: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.192 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6cb6eb8a-d92a-4e43-9993-a5f5f73c41ac', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-0000002e-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-tap4130ae9c-75', 'timestamp': '2025-12-11T06:19:30.191800', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'tap4130ae9c-75', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:28:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4130ae9c-75'}, 'message_id': '5a2eadb6-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.811501792, 'message_signature': 'd39e3bdc03828d8701ff9625a3487f5409f6111b0f5073e00ff6d59c608b37b6'}]}, 'timestamp': '2025-12-11 06:19:30.192848', '_unique_id': '376c1db7a56e49cea7268370eb894a2b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.193 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.194 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of disk.device.usage: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.206 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.207 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ea42a87a-9ead-4eee-a433-06199da98a56', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-vda', 'timestamp': '2025-12-11T06:19:30.193971', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5a30e428-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.896222742, 'message_signature': '4aebf1f50c4d747b8072f668039e005a7ecc37fc562ad38136c74349dd2eff63'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': 
None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-sda', 'timestamp': '2025-12-11T06:19:30.193971', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5a30f0b2-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.896222742, 'message_signature': 'eeaa4050d341a58bb86f995578a21f975f62565469d5a2a949c9243eaa5132c6'}]}, 'timestamp': '2025-12-11 06:19:30.207666', '_unique_id': '12aaa092973b492e91b95b4c9e11de0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.209 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.210 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of disk.device.write.bytes: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.210 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.210 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'bc3ab4c9-1d62-4050-84bb-63db595eee99', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-vda', 'timestamp': '2025-12-11T06:19:30.209861', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5a316f6a-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': '73de8b294bb21885f122c406a76aa427e018939326efa21fbb4d9ec27a3ca632'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 
'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-sda', 'timestamp': '2025-12-11T06:19:30.209861', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5a31799c-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': 'be1bc22a8da60ecbe9ff9fe96fc9f1b3777057f375b975585a277f5abdebd5b0'}]}, 'timestamp': '2025-12-11 06:19:30.211194', '_unique_id': 'd7feb63f872341d3b5dc72f4b94974c3'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.211 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.212 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.213 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of disk.device.allocation: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.213 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.213 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '88abee4e-808c-4b55-aac6-71e6888d93a8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-vda', 'timestamp': '2025-12-11T06:19:30.212595', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5a31cfe6-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.896222742, 'message_signature': '789d596a53aaaf9f153eba7aa21faf9aa2860ebcae02dc44ef96fcc220a76cfb'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-sda', 'timestamp': '2025-12-11T06:19:30.212595', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5a31db80-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.896222742, 'message_signature': 'a6721b8a1306cad87501f1b94ea6cbcd01134e37ad09b2f353042c08a46aa4e2'}]}, 'timestamp': '2025-12-11 06:19:30.213702', '_unique_id': '8f7a179681fc4490b6611175bf65b90c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.214 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.215 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of network.incoming.bytes: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.215 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d00ab0c-b292-409a-b9eb-1e4246438e51', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-0000002e-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-tap4130ae9c-75', 'timestamp': '2025-12-11T06:19:30.215048', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'tap4130ae9c-75', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:28:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4130ae9c-75'}, 'message_id': '5a3231f2-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.811501792, 'message_signature': '08eb69473314a5b1f5b3289205bcb59e8438c86c2c3c3acb09344395d78421ba'}]}, 'timestamp': '2025-12-11 06:19:30.215909', '_unique_id': '01667acaf551424db26ee752bd6d4a0e'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.216 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.217 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of network.outgoing.bytes: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.217 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '343e9543-b954-4181-850c-41dd6a6e8185', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-0000002e-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-tap4130ae9c-75', 'timestamp': '2025-12-11T06:19:30.216994', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'tap4130ae9c-75', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:28:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4130ae9c-75'}, 'message_id': '5a327888-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.811501792, 'message_signature': '44c44335d3fe8bfd7b9d10ae545b3e4fb4f660dcb0578d25571a5195a29120c9'}]}, 'timestamp': '2025-12-11 06:19:30.217697', '_unique_id': 'da1125136182467d9c3737de0caaf071'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.218 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.219 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of disk.device.capacity: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.219 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.219 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '304c0237-1b9d-4766-89c6-9c9b46e8d10c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-vda', 'timestamp': '2025-12-11T06:19:30.218877', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5a32c572-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.896222742, 'message_signature': '127e6eb67eeea73a1d7b2b52ca0f9f882ed2c509e48b65e84dd6bc61f856cafa'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-sda', 'timestamp': '2025-12-11T06:19:30.218877', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5a32ce78-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.896222742, 'message_signature': '26d46aba99894fd0d9fff233873643e6f9143c7ca0ed44ce02aa24e4dd7e9c92'}]}, 'timestamp': '2025-12-11 06:19:30.219891', '_unique_id': '844bc75d1883490bab1afe5ab9de74e1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.220 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.221 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.221 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of network.incoming.packets: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.221 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b19daec7-a98d-4ddf-b69f-770827ed242b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-0000002e-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-tap4130ae9c-75', 'timestamp': '2025-12-11T06:19:30.221277', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'tap4130ae9c-75', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:28:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4130ae9c-75'}, 'message_id': '5a332242-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.811501792, 'message_signature': 'd508cd02f76eade87acfa3cd86850982a133cd4617ea691957633e07ec56b27d'}]}, 'timestamp': '2025-12-11 06:19:30.222042', '_unique_id': '9d15db48453049aeb5d9cff7133b4d0b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.222 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.223 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.223 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of memory.usage: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.223 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.223 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230: ceilometer.compute.pollsters.NoVolumeException
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.224 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of disk.device.write.requests: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.224 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.224 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1d12f426-4719-47c4-983e-e8085051876c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-vda', 'timestamp': '2025-12-11T06:19:30.223784', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5a338598-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': '5da500501b4a2337f49669e161e022509edd734c778a77053b88c72ee836b12a'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-sda', 'timestamp': '2025-12-11T06:19:30.223784', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5a338d9a-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': '78280b50fe46e8c97aac6aeb1e7391c030feea1c4d0274b830bc536c7306cb1d'}]}, 'timestamp': '2025-12-11 06:19:30.224772', '_unique_id': 'eee521346fde40b6a8b098faadb256c5'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.225 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.226 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of disk.device.read.bytes: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.226 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.226 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '78259213-0ddd-4a6c-a8a8-6db0eb51e8b0', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-vda', 'timestamp': '2025-12-11T06:19:30.225956', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': '5a33da8e-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': 'd70af83cf1da40e1d43eb96902fdb260362b47382af56ceae7ecc6fae647ca37'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-sda', 'timestamp': '2025-12-11T06:19:30.225956', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'instance-0000002e', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': '5a33e42a-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.843341955, 'message_signature': '5779fafcd89b15becdde1095c6c50db9e8c41f5ec6d4bb3cce1131fd732487be'}]}, 'timestamp': '2025-12-11 06:19:30.227003', '_unique_id': '3fe510801d794f55bac7408cf3f229ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.227 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.228 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.228 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of network.outgoing.bytes.delta: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1f48d7cf-8b2f-4ac3-afbc-a06e0256fea9', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-0000002e-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-tap4130ae9c-75', 'timestamp': '2025-12-11T06:19:30.228337', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'tap4130ae9c-75', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:28:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4130ae9c-75'}, 'message_id': '5a343d62-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.811501792, 'message_signature': '2e8002ee10e107a308e49254572020b3cdb99316e85be8556af13d320d801d9b'}]}, 'timestamp': '2025-12-11 06:19:30.229315', '_unique_id': 'ef611ed2c9ed4cd28830db32e747083d'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.229 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.230 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.231 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of network.incoming.packets.drop: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.231 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'c1774c84-60e9-4f82-8edc-304686c6ad0b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-0000002e-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-tap4130ae9c-75', 'timestamp': '2025-12-11T06:19:30.230594', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'tap4130ae9c-75', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:28:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4130ae9c-75'}, 'message_id': '5a3497da-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.811501792, 'message_signature': 'ee4b4a0f822c132d0156996ed2560066af2a8e294f80f4a37b6c171285d00b59'}]}, 'timestamp': '2025-12-11 06:19:30.231612', '_unique_id': '05c61bdc26bf4f3f9cb306114e27e353'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.232 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.233 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.233 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-2028969009>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-2028969009>, <NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918>]
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.233 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.233 12 DEBUG ceilometer.compute.pollsters [-] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 was shut off while getting sample of network.incoming.packets.error: Failed to inspect data of instance <name=instance-0000002f, id=ce6856f2-bbd2-465e-a1bd-4af8e2f38591>, domain state is SHUTOFF. get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:152
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 DEBUG ceilometer.compute.pollsters [-] 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '46f848a6-0841-446d-8da4-65abfd9d4419', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-0000002e-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-tap4130ae9c-75', 'timestamp': '2025-12-11T06:19:30.233427', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918', 'name': 'tap4130ae9c-75', 'instance_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:d9:28:31', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap4130ae9c-75'}, 'message_id': '5a350148-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4244.811501792, 'message_signature': '855304d8669da7e0a83738bcf2e876b65f8f372e352940b9d3ff04f0f5b77860'}]}, 'timestamp': '2025-12-11 06:19:30.234319', '_unique_id': 'b27dd76116f74dcd8936a40f73795dbf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:19:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:19:30.234 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:19:30 np0005554845 nova_compute[187128]: 2025-12-11 06:19:30.757 187132 INFO nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Creating config drive at /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk.config#033[00m
Dec 11 01:19:30 np0005554845 nova_compute[187128]: 2025-12-11 06:19:30.763 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf0genv03 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:19:30 np0005554845 nova_compute[187128]: 2025-12-11 06:19:30.885 187132 DEBUG oslo_concurrency.processutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpf0genv03" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:19:30 np0005554845 kernel: tape34c30d9-f9: entered promiscuous mode
Dec 11 01:19:30 np0005554845 NetworkManager[55529]: <info>  [1765433970.9531] manager: (tape34c30d9-f9): new Tun device (/org/freedesktop/NetworkManager/Devices/153)
Dec 11 01:19:31 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:31Z|00295|binding|INFO|Claiming lport e34c30d9-f946-403a-8b26-75716a1be5df for this chassis.
Dec 11 01:19:31 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:31Z|00296|binding|INFO|e34c30d9-f946-403a-8b26-75716a1be5df: Claiming fa:16:3e:06:8b:82 10.100.0.9 2001:db8:0:1:f816:3eff:fe06:8b82 2001:db8::f816:3eff:fe06:8b82
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.013 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:31 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:31Z|00297|binding|INFO|Setting lport e34c30d9-f946-403a-8b26-75716a1be5df ovn-installed in OVS
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.029 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.033 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:31 np0005554845 systemd-udevd[224253]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:19:31 np0005554845 systemd-machined[153381]: New machine qemu-23-instance-0000002f.
Dec 11 01:19:31 np0005554845 NetworkManager[55529]: <info>  [1765433971.0672] device (tape34c30d9-f9): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:19:31 np0005554845 systemd[1]: Started Virtual Machine qemu-23-instance-0000002f.
Dec 11 01:19:31 np0005554845 NetworkManager[55529]: <info>  [1765433971.0679] device (tape34c30d9-f9): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:19:31 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:31Z|00298|binding|INFO|Setting lport e34c30d9-f946-403a-8b26-75716a1be5df up in Southbound
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.098 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:8b:82 10.100.0.9 2001:db8:0:1:f816:3eff:fe06:8b82 2001:db8::f816:3eff:fe06:8b82'], port_security=['fa:16:3e:06:8b:82 10.100.0.9 2001:db8:0:1:f816:3eff:fe06:8b82 2001:db8::f816:3eff:fe06:8b82'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fe06:8b82/64 2001:db8::f816:3eff:fe06:8b82/64', 'neutron:device_id': 'ce6856f2-bbd2-465e-a1bd-4af8e2f38591', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d06dc841-febe-4a7e-b747-ad772083d6d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '1338b0a4-34fa-4fa1-a3dd-77213283cd6e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39ee0a97-89df-4836-8fd5-1fa735eea42f, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=e34c30d9-f946-403a-8b26-75716a1be5df) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.099 104320 INFO neutron.agent.ovn.metadata.agent [-] Port e34c30d9-f946-403a-8b26-75716a1be5df in datapath d06dc841-febe-4a7e-b747-ad772083d6d5 bound to our chassis#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.101 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network d06dc841-febe-4a7e-b747-ad772083d6d5#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.112 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[39bd4b64-5896-47c4-980b-cde90072ba4a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.112 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapd06dc841-f1 in ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.114 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapd06dc841-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.114 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ec3f065a-f386-408c-91e0-ec85d2e01df3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.115 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2947a7ce-03f1-48a9-97d7-ef3052365999]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.129 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[15ab3b0b-bb84-4b3a-9bb8-6513d808ed6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.142 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6a0a0db3-61b0-4d77-865b-d4efe23e217d]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.170 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[bbc37a78-405e-473e-b7d4-e175ccb1a257]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.179 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ae1f8270-92fa-4efe-9da4-584d3748edcb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 NetworkManager[55529]: <info>  [1765433971.1800] manager: (tapd06dc841-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/154)
Dec 11 01:19:31 np0005554845 systemd-udevd[224255]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.218 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[63b13341-b18c-46a3-a38a-3964b6c80de5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.221 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[489694f3-1fad-465f-b493-c8938106a02f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 NetworkManager[55529]: <info>  [1765433971.2418] device (tapd06dc841-f0): carrier: link connected
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.245 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[bb5e35f4-262c-4e1b-936d-50c9996084c8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.261 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cf8e2acd-1056-435c-bcbe-11774c001db6]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd06dc841-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:82:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424588, 'reachable_time': 41854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224289, 'error': None, 'target': 'ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.278 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[427b0034-6406-417d-9ebd-9174007c69c4]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb0:8257'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 424588, 'tstamp': 424588}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 224290, 'error': None, 'target': 'ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.295 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2821296a-7a85-4007-a4bb-e4ef3aae6556]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapd06dc841-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b0:82:57'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 90], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424588, 'reachable_time': 41854, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 224291, 'error': None, 'target': 'ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.336 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[90485e2f-a38a-44e8-aa0c-b245a4c7f9c1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.423 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b5f3f2ea-6f87-4d3c-96a6-b5d46d5f37cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.426 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd06dc841-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.426 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.428 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapd06dc841-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:31 np0005554845 NetworkManager[55529]: <info>  [1765433971.4312] manager: (tapd06dc841-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/155)
Dec 11 01:19:31 np0005554845 kernel: tapd06dc841-f0: entered promiscuous mode
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.430 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.436 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapd06dc841-f0, col_values=(('external_ids', {'iface-id': '46b91b6b-a8b6-4052-b122-b5d780c4ad3f'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:31 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:31Z|00299|binding|INFO|Releasing lport 46b91b6b-a8b6-4052-b122-b5d780c4ad3f from this chassis (sb_readonly=0)
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.438 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.442 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/d06dc841-febe-4a7e-b747-ad772083d6d5.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/d06dc841-febe-4a7e-b747-ad772083d6d5.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.443 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[51ba6861-cb69-4254-83b8-3c4a2c627d6f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.446 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-d06dc841-febe-4a7e-b747-ad772083d6d5
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/d06dc841-febe-4a7e-b747-ad772083d6d5.pid.haproxy
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID d06dc841-febe-4a7e-b747-ad772083d6d5
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:19:31 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:31.447 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5', 'env', 'PROCESS_TAG=haproxy-d06dc841-febe-4a7e-b747-ad772083d6d5', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/d06dc841-febe-4a7e-b747-ad772083d6d5.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.450 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.495 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433971.49453, ce6856f2-bbd2-465e-a1bd-4af8e2f38591 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.495 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] VM Started (Lifecycle Event)#033[00m
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.516 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.520 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433971.4947197, ce6856f2-bbd2-465e-a1bd-4af8e2f38591 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.521 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.538 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.541 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:19:31 np0005554845 nova_compute[187128]: 2025-12-11 06:19:31.563 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:19:31 np0005554845 podman[224330]: 2025-12-11 06:19:31.83233096 +0000 UTC m=+0.050996206 container create 87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 11 01:19:31 np0005554845 systemd[1]: Started libpod-conmon-87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a.scope.
Dec 11 01:19:31 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:19:31 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/415bd6cb559c559d4a5b7384c5df687937c2927b93874dd2081fdc3dbf610749/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:19:31 np0005554845 podman[224330]: 2025-12-11 06:19:31.808271427 +0000 UTC m=+0.026936693 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:19:31 np0005554845 podman[224330]: 2025-12-11 06:19:31.914765609 +0000 UTC m=+0.133430875 container init 87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 11 01:19:31 np0005554845 podman[224330]: 2025-12-11 06:19:31.92216871 +0000 UTC m=+0.140833956 container start 87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 01:19:31 np0005554845 neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5[224345]: [NOTICE]   (224349) : New worker (224351) forked
Dec 11 01:19:31 np0005554845 neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5[224345]: [NOTICE]   (224349) : Loading success.
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.153 187132 DEBUG nova.compute.manager [req-d32195f1-79a7-4aaf-8030-2f96c8ae2b09 req-69221e5f-67f5-4a8b-a5b8-3a9d6c871e6e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Received event network-vif-plugged-e34c30d9-f946-403a-8b26-75716a1be5df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.154 187132 DEBUG oslo_concurrency.lockutils [req-d32195f1-79a7-4aaf-8030-2f96c8ae2b09 req-69221e5f-67f5-4a8b-a5b8-3a9d6c871e6e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.156 187132 DEBUG oslo_concurrency.lockutils [req-d32195f1-79a7-4aaf-8030-2f96c8ae2b09 req-69221e5f-67f5-4a8b-a5b8-3a9d6c871e6e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.002s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.157 187132 DEBUG oslo_concurrency.lockutils [req-d32195f1-79a7-4aaf-8030-2f96c8ae2b09 req-69221e5f-67f5-4a8b-a5b8-3a9d6c871e6e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.157 187132 DEBUG nova.compute.manager [req-d32195f1-79a7-4aaf-8030-2f96c8ae2b09 req-69221e5f-67f5-4a8b-a5b8-3a9d6c871e6e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Processing event network-vif-plugged-e34c30d9-f946-403a-8b26-75716a1be5df _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.158 187132 DEBUG nova.compute.manager [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.162 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765433972.1621523, ce6856f2-bbd2-465e-a1bd-4af8e2f38591 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.163 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.165 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.168 187132 INFO nova.virt.libvirt.driver [-] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Instance spawned successfully.#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.168 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.186 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.191 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.203 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.204 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.205 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.205 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.206 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.206 187132 DEBUG nova.virt.libvirt.driver [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.216 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.276 187132 INFO nova.compute.manager [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Took 11.88 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.277 187132 DEBUG nova.compute.manager [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.356 187132 INFO nova.compute.manager [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Took 12.79 seconds to build instance.#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.411 187132 DEBUG oslo_concurrency.lockutils [None req-37ebcf54-6262-477e-95a5-68ec0c7f4a58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 12.927s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.411 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591" acquired by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: waited 11.669s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.412 187132 INFO nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:19:32 np0005554845 nova_compute[187128]: 2025-12-11 06:19:32.412 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591" "released" by "nova.compute.manager.ComputeManager._sync_power_states.<locals>._sync.<locals>.query_driver_power_state_and_sync" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.079 187132 DEBUG nova.network.neutron [req-e0ccaed9-86ac-4bfb-9cfa-ff2ed716de39 req-2332fea8-e3a2-42b0-b655-ffa7874372ce eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Updated VIF entry in instance network info cache for port e34c30d9-f946-403a-8b26-75716a1be5df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.080 187132 DEBUG nova.network.neutron [req-e0ccaed9-86ac-4bfb-9cfa-ff2ed716de39 req-2332fea8-e3a2-42b0-b655-ffa7874372ce eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Updating instance_info_cache with network_info: [{"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", 
"profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.108 187132 DEBUG oslo_concurrency.lockutils [req-e0ccaed9-86ac-4bfb-9cfa-ff2ed716de39 req-2332fea8-e3a2-42b0-b655-ffa7874372ce eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.787 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.788 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.788 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.789 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.870 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.935 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:19:33 np0005554845 podman[224360]: 2025-12-11 06:19:33.938492811 +0000 UTC m=+0.106221116 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, 
managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 01:19:33 np0005554845 nova_compute[187128]: 2025-12-11 06:19:33.941 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.004 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.011 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.074 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk --force-share --output=json" returned: 0 in 0.063s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.075 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.136 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230/disk --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.253 187132 DEBUG nova.compute.manager [req-05583b87-e611-48d1-9191-c62e830c1d0c req-997b045a-d069-460c-a81b-3bb08d8fe7e6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Received event network-vif-plugged-e34c30d9-f946-403a-8b26-75716a1be5df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.253 187132 DEBUG oslo_concurrency.lockutils [req-05583b87-e611-48d1-9191-c62e830c1d0c req-997b045a-d069-460c-a81b-3bb08d8fe7e6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.254 187132 DEBUG oslo_concurrency.lockutils [req-05583b87-e611-48d1-9191-c62e830c1d0c req-997b045a-d069-460c-a81b-3bb08d8fe7e6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.254 187132 DEBUG oslo_concurrency.lockutils [req-05583b87-e611-48d1-9191-c62e830c1d0c req-997b045a-d069-460c-a81b-3bb08d8fe7e6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.254 187132 DEBUG nova.compute.manager [req-05583b87-e611-48d1-9191-c62e830c1d0c req-997b045a-d069-460c-a81b-3bb08d8fe7e6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] No waiting events found dispatching network-vif-plugged-e34c30d9-f946-403a-8b26-75716a1be5df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.255 187132 WARNING nova.compute.manager [req-05583b87-e611-48d1-9191-c62e830c1d0c req-997b045a-d069-460c-a81b-3bb08d8fe7e6 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Received unexpected event network-vif-plugged-e34c30d9-f946-403a-8b26-75716a1be5df for instance with vm_state active and task_state None.#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.326 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.327 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5355MB free_disk=73.29003143310547GB free_vcpus=6 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.328 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.328 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.444 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.445 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.445 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 2 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.445 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=768MB phys_disk=79GB used_disk=2GB total_vcpus=8 used_vcpus=2 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.501 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.532 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.574 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.575 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.247s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:34 np0005554845 nova_compute[187128]: 2025-12-11 06:19:34.719 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:35 np0005554845 podman[224392]: 2025-12-11 06:19:35.128418682 +0000 UTC m=+0.054552673 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, 
container_name=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 01:19:35 np0005554845 podman[224393]: 2025-12-11 06:19:35.161089139 +0000 UTC m=+0.087937900 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, org.label-schema.build-date=20251202, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:19:35 np0005554845 nova_compute[187128]: 2025-12-11 06:19:35.571 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:19:35 np0005554845 nova_compute[187128]: 2025-12-11 06:19:35.572 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:19:36 np0005554845 nova_compute[187128]: 2025-12-11 06:19:36.032 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:36 np0005554845 nova_compute[187128]: 2025-12-11 06:19:36.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:19:36 np0005554845 nova_compute[187128]: 2025-12-11 06:19:36.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:19:36 np0005554845 nova_compute[187128]: 2025-12-11 06:19:36.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:19:36 np0005554845 nova_compute[187128]: 2025-12-11 06:19:36.927 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:19:36 np0005554845 nova_compute[187128]: 2025-12-11 06:19:36.928 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:19:36 np0005554845 nova_compute[187128]: 2025-12-11 06:19:36.928 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 01:19:36 np0005554845 nova_compute[187128]: 2025-12-11 06:19:36.928 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:19:38 np0005554845 nova_compute[187128]: 2025-12-11 06:19:38.210 187132 DEBUG nova.compute.manager [req-12a33abc-ee90-49db-a900-1b159c88a6db req-2fa2174a-3f00-4609-8682-8801b2e3eac2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Received event network-changed-e34c30d9-f946-403a-8b26-75716a1be5df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:19:38 np0005554845 nova_compute[187128]: 2025-12-11 06:19:38.211 187132 DEBUG nova.compute.manager [req-12a33abc-ee90-49db-a900-1b159c88a6db req-2fa2174a-3f00-4609-8682-8801b2e3eac2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Refreshing instance network info cache due to event network-changed-e34c30d9-f946-403a-8b26-75716a1be5df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:19:38 np0005554845 nova_compute[187128]: 2025-12-11 06:19:38.212 187132 DEBUG oslo_concurrency.lockutils [req-12a33abc-ee90-49db-a900-1b159c88a6db req-2fa2174a-3f00-4609-8682-8801b2e3eac2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:19:38 np0005554845 nova_compute[187128]: 2025-12-11 06:19:38.212 187132 DEBUG oslo_concurrency.lockutils [req-12a33abc-ee90-49db-a900-1b159c88a6db req-2fa2174a-3f00-4609-8682-8801b2e3eac2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:19:38 np0005554845 nova_compute[187128]: 2025-12-11 06:19:38.213 187132 DEBUG nova.network.neutron [req-12a33abc-ee90-49db-a900-1b159c88a6db req-2fa2174a-3f00-4609-8682-8801b2e3eac2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Refreshing network info cache for port e34c30d9-f946-403a-8b26-75716a1be5df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:19:39 np0005554845 podman[224432]: 2025-12-11 06:19:39.139461137 +0000 UTC m=+0.063644109 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, config_id=multipathd, io.buildah.version=1.41.3, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:19:39 np0005554845 nova_compute[187128]: 2025-12-11 06:19:39.334 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Updating instance_info_cache with network_info: [{"id": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "address": "fa:16:3e:d9:28:31", "network": {"id": "70d1c6fd-8452-4bef-babc-0687c3b7f28f", "bridge": "br-int", "label": "tempest-network-smoke--1735160449", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4130ae9c-75", "ovs_interfaceid": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:19:39 np0005554845 nova_compute[187128]: 2025-12-11 06:19:39.361 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:19:39 np0005554845 nova_compute[187128]: 2025-12-11 06:19:39.361 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:19:39 np0005554845 nova_compute[187128]: 2025-12-11 06:19:39.362 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:19:39 np0005554845 nova_compute[187128]: 2025-12-11 06:19:39.363 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:19:39 np0005554845 nova_compute[187128]: 2025-12-11 06:19:39.363 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:19:39 np0005554845 nova_compute[187128]: 2025-12-11 06:19:39.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:19:39 np0005554845 nova_compute[187128]: 2025-12-11 06:19:39.724 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:39 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:39Z|00040|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:d9:28:31 10.100.0.9
Dec 11 01:19:39 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:39Z|00041|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:d9:28:31 10.100.0.9
Dec 11 01:19:40 np0005554845 nova_compute[187128]: 2025-12-11 06:19:40.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:19:40 np0005554845 nova_compute[187128]: 2025-12-11 06:19:40.733 187132 DEBUG nova.network.neutron [req-12a33abc-ee90-49db-a900-1b159c88a6db req-2fa2174a-3f00-4609-8682-8801b2e3eac2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Updated VIF entry in instance network info cache for port e34c30d9-f946-403a-8b26-75716a1be5df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:19:40 np0005554845 nova_compute[187128]: 2025-12-11 06:19:40.734 187132 DEBUG nova.network.neutron [req-12a33abc-ee90-49db-a900-1b159c88a6db req-2fa2174a-3f00-4609-8682-8801b2e3eac2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Updating instance_info_cache with network_info: [{"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:19:40 np0005554845 nova_compute[187128]: 2025-12-11 06:19:40.763 187132 DEBUG oslo_concurrency.lockutils [req-12a33abc-ee90-49db-a900-1b159c88a6db req-2fa2174a-3f00-4609-8682-8801b2e3eac2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:19:41 np0005554845 nova_compute[187128]: 2025-12-11 06:19:41.035 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:43 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:43Z|00042|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:06:8b:82 10.100.0.9
Dec 11 01:19:43 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:43Z|00043|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:06:8b:82 10.100.0.9
Dec 11 01:19:44 np0005554845 podman[224486]: 2025-12-11 06:19:44.127289236 +0000 UTC m=+0.059702813 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, release=1755695350, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, vcs-type=git, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., managed_by=edpm_ansible)
Dec 11 01:19:44 np0005554845 podman[224485]: 2025-12-11 06:19:44.152460799 +0000 UTC m=+0.085702348 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:19:44 np0005554845 nova_compute[187128]: 2025-12-11 06:19:44.728 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:45 np0005554845 nova_compute[187128]: 2025-12-11 06:19:45.773 187132 DEBUG oslo_concurrency.lockutils [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:45 np0005554845 nova_compute[187128]: 2025-12-11 06:19:45.773 187132 DEBUG oslo_concurrency.lockutils [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:45 np0005554845 nova_compute[187128]: 2025-12-11 06:19:45.774 187132 DEBUG oslo_concurrency.lockutils [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:45 np0005554845 nova_compute[187128]: 2025-12-11 06:19:45.774 187132 DEBUG oslo_concurrency.lockutils [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:19:45 np0005554845 nova_compute[187128]: 2025-12-11 06:19:45.774 187132 DEBUG oslo_concurrency.lockutils [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:19:45 np0005554845 nova_compute[187128]: 2025-12-11 06:19:45.776 187132 INFO nova.compute.manager [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Terminating instance#033[00m
Dec 11 01:19:45 np0005554845 nova_compute[187128]: 2025-12-11 06:19:45.777 187132 DEBUG nova.compute.manager [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:19:45 np0005554845 kernel: tap4130ae9c-75 (unregistering): left promiscuous mode
Dec 11 01:19:45 np0005554845 NetworkManager[55529]: <info>  [1765433985.8058] device (tap4130ae9c-75): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:19:45 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:45Z|00300|binding|INFO|Releasing lport 4130ae9c-75bf-4b86-9a73-77d0424ede65 from this chassis (sb_readonly=0)
Dec 11 01:19:45 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:45Z|00301|binding|INFO|Setting lport 4130ae9c-75bf-4b86-9a73-77d0424ede65 down in Southbound
Dec 11 01:19:45 np0005554845 nova_compute[187128]: 2025-12-11 06:19:45.811 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:45 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:45Z|00302|binding|INFO|Removing iface tap4130ae9c-75 ovn-installed in OVS
Dec 11 01:19:45 np0005554845 nova_compute[187128]: 2025-12-11 06:19:45.814 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:45 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:45.819 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:28:31 10.100.0.9'], port_security=['fa:16:3e:d9:28:31 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70d1c6fd-8452-4bef-babc-0687c3b7f28f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b08d38e1-0070-4521-a08b-d01957bc4e8b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c7d0767-431c-4590-9f2e-0b039556ea48, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=4130ae9c-75bf-4b86-9a73-77d0424ede65) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:19:45 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:45.820 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 4130ae9c-75bf-4b86-9a73-77d0424ede65 in datapath 70d1c6fd-8452-4bef-babc-0687c3b7f28f unbound from our chassis#033[00m
Dec 11 01:19:45 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:45.822 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 70d1c6fd-8452-4bef-babc-0687c3b7f28f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:19:45 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:45.823 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ad904c70-3363-485c-823a-cfee0399f4ae]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:45 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:45.824 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f namespace which is not needed anymore#033[00m
Dec 11 01:19:45 np0005554845 nova_compute[187128]: 2025-12-11 06:19:45.830 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:45 np0005554845 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002e.scope: Deactivated successfully.
Dec 11 01:19:45 np0005554845 systemd[1]: machine-qemu\x2d22\x2dinstance\x2d0000002e.scope: Consumed 13.458s CPU time.
Dec 11 01:19:45 np0005554845 systemd-machined[153381]: Machine qemu-22-instance-0000002e terminated.
Dec 11 01:19:45 np0005554845 neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f[224191]: [NOTICE]   (224195) : haproxy version is 2.8.14-c23fe91
Dec 11 01:19:45 np0005554845 neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f[224191]: [NOTICE]   (224195) : path to executable is /usr/sbin/haproxy
Dec 11 01:19:45 np0005554845 neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f[224191]: [WARNING]  (224195) : Exiting Master process...
Dec 11 01:19:45 np0005554845 neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f[224191]: [ALERT]    (224195) : Current worker (224197) exited with code 143 (Terminated)
Dec 11 01:19:45 np0005554845 neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f[224191]: [WARNING]  (224195) : All workers exited. Exiting... (0)
Dec 11 01:19:45 np0005554845 systemd[1]: libpod-a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79.scope: Deactivated successfully.
Dec 11 01:19:45 np0005554845 kernel: tap4130ae9c-75: entered promiscuous mode
Dec 11 01:19:45 np0005554845 podman[224552]: 2025-12-11 06:19:45.995010972 +0000 UTC m=+0.070741423 container died a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:19:45 np0005554845 NetworkManager[55529]: <info>  [1765433985.9951] manager: (tap4130ae9c-75): new Tun device (/org/freedesktop/NetworkManager/Devices/156)
Dec 11 01:19:45 np0005554845 kernel: tap4130ae9c-75 (unregistering): left promiscuous mode
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.001 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:46 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:46Z|00303|binding|INFO|Claiming lport 4130ae9c-75bf-4b86-9a73-77d0424ede65 for this chassis.
Dec 11 01:19:46 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:46Z|00304|binding|INFO|4130ae9c-75bf-4b86-9a73-77d0424ede65: Claiming fa:16:3e:d9:28:31 10.100.0.9
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.012 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:28:31 10.100.0.9'], port_security=['fa:16:3e:d9:28:31 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70d1c6fd-8452-4bef-babc-0687c3b7f28f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b08d38e1-0070-4521-a08b-d01957bc4e8b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c7d0767-431c-4590-9f2e-0b039556ea48, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=4130ae9c-75bf-4b86-9a73-77d0424ede65) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.019 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:46 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:46Z|00305|binding|INFO|Releasing lport 4130ae9c-75bf-4b86-9a73-77d0424ede65 from this chassis (sb_readonly=0)
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.029 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:d9:28:31 10.100.0.9'], port_security=['fa:16:3e:d9:28:31 10.100.0.9'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28', 'neutron:device_id': '8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-70d1c6fd-8452-4bef-babc-0687c3b7f28f', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'neutron:revision_number': '4', 'neutron:security_group_ids': 'b08d38e1-0070-4521-a08b-d01957bc4e8b', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=2c7d0767-431c-4590-9f2e-0b039556ea48, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=4130ae9c-75bf-4b86-9a73-77d0424ede65) old=Port_Binding(chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.037 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:46 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79-userdata-shm.mount: Deactivated successfully.
Dec 11 01:19:46 np0005554845 systemd[1]: var-lib-containers-storage-overlay-c461b9f2376e7106a4becd3538b59684b24222a26782cc97116f8a074e02453c-merged.mount: Deactivated successfully.
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.054 187132 INFO nova.virt.libvirt.driver [-] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Instance destroyed successfully.#033[00m
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.055 187132 DEBUG nova.objects.instance [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lazy-loading 'resources' on Instance uuid 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:19:46 np0005554845 podman[224552]: 2025-12-11 06:19:46.059718818 +0000 UTC m=+0.135449259 container cleanup a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:19:46 np0005554845 systemd[1]: libpod-conmon-a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79.scope: Deactivated successfully.
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.069 187132 DEBUG nova.virt.libvirt.vif [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:19:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-0-1523114918',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2036320412-ge',id=46,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBG8JKWy+EWk1Ndcw6fVQ26alNEwWP4gw2+NTnNH3PedrzxDTvbUcFjLRogcSgB2p5+KEQUW/zt4BNLK2292WmZR8aDv5jUsLCWToQeEWQicJwkexSkfuo677WEhwXkhZWQ==',key_name='tempest-TestSecurityGroupsBasicOps-2112506805',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:19:28Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9d8630abd3cd4aef89d0b1af6e62ac93',ramdisk_id='',reservation_id='r-8hit10ov',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2036320412',owner_user_name='tempest-TestSecurityGroupsBasicOps-2036320412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:19:28Z,user_data=None,user_id='78548cbaea0e406ebb716882c382c954',uuid=8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "address": "fa:16:3e:d9:28:31", "network": {"id": "70d1c6fd-8452-4bef-babc-0687c3b7f28f", "bridge": "br-int", "label": "tempest-network-smoke--1735160449", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4130ae9c-75", "ovs_interfaceid": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.070 187132 DEBUG nova.network.os_vif_util [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converting VIF {"id": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "address": "fa:16:3e:d9:28:31", "network": {"id": "70d1c6fd-8452-4bef-babc-0687c3b7f28f", "bridge": "br-int", "label": "tempest-network-smoke--1735160449", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap4130ae9c-75", "ovs_interfaceid": "4130ae9c-75bf-4b86-9a73-77d0424ede65", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.070 187132 DEBUG nova.network.os_vif_util [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:d9:28:31,bridge_name='br-int',has_traffic_filtering=True,id=4130ae9c-75bf-4b86-9a73-77d0424ede65,network=Network(70d1c6fd-8452-4bef-babc-0687c3b7f28f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4130ae9c-75') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.071 187132 DEBUG os_vif [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:28:31,bridge_name='br-int',has_traffic_filtering=True,id=4130ae9c-75bf-4b86-9a73-77d0424ede65,network=Network(70d1c6fd-8452-4bef-babc-0687c3b7f28f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4130ae9c-75') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.072 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.073 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap4130ae9c-75, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.074 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.076 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.080 187132 INFO os_vif [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:d9:28:31,bridge_name='br-int',has_traffic_filtering=True,id=4130ae9c-75bf-4b86-9a73-77d0424ede65,network=Network(70d1c6fd-8452-4bef-babc-0687c3b7f28f),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap4130ae9c-75')
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.080 187132 INFO nova.virt.libvirt.driver [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Deleting instance files /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230_del
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.081 187132 INFO nova.virt.libvirt.driver [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Deletion of /var/lib/nova/instances/8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230_del complete
Dec 11 01:19:46 np0005554845 podman[224594]: 2025-12-11 06:19:46.122595126 +0000 UTC m=+0.038497686 container remove a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f, org.label-schema.vendor=CentOS, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.127 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a55435d9-6661-4a59-8a5a-74df59eb0739]: (4, ('Thu Dec 11 06:19:45 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f (a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79)\na0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79\nThu Dec 11 06:19:46 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f (a0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79)\na0387b997456800e84020de09ae927f5e446074e647249a5d27cefe6450b5c79\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.129 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[bb199d6b-f7c3-47e7-9a3b-5f98e95788d1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.130 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap70d1c6fd-80, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89
Dec 11 01:19:46 np0005554845 kernel: tap70d1c6fd-80: left promiscuous mode
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.135 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.136 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.139 187132 INFO nova.compute.manager [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Took 0.36 seconds to destroy the instance on the hypervisor.
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.140 187132 DEBUG oslo.service.loopingcall [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.140 187132 DEBUG nova.compute.manager [-] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.140 187132 DEBUG nova.network.neutron [-] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.140 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[af2f3a44-68d6-42b8-a727-ab52da8f4a65]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:19:46 np0005554845 nova_compute[187128]: 2025-12-11 06:19:46.144 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.168 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3743de9e-9250-4055-af59-a4159675477b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.170 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[41179cbf-0574-41e8-bf30-452960e9ab95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.190 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[51ea2290-0a86-4b73-8081-876b3e548313]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424059, 'reachable_time': 40675, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224609, 'error': None, 'target': 'ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.193 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-70d1c6fd-8452-4bef-babc-0687c3b7f28f deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.194 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[ea9b976d-58bb-44e6-83a4-8a7ba14eb458]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.194 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 4130ae9c-75bf-4b86-9a73-77d0424ede65 in datapath 70d1c6fd-8452-4bef-babc-0687c3b7f28f unbound from our chassis
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.195 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 70d1c6fd-8452-4bef-babc-0687c3b7f28f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 11 01:19:46 np0005554845 systemd[1]: run-netns-ovnmeta\x2d70d1c6fd\x2d8452\x2d4bef\x2dbabc\x2d0687c3b7f28f.mount: Deactivated successfully.
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.196 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3ad77b6b-baf9-4dfb-9c42-909e99813dfd]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.196 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 4130ae9c-75bf-4b86-9a73-77d0424ede65 in datapath 70d1c6fd-8452-4bef-babc-0687c3b7f28f unbound from our chassis
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.197 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 70d1c6fd-8452-4bef-babc-0687c3b7f28f, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 11 01:19:46 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:46.198 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8958c6f8-e2fb-4f78-8452-4c523dd288d2]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.740 187132 DEBUG nova.network.neutron [-] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.764 187132 INFO nova.compute.manager [-] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Took 2.62 seconds to deallocate network for instance.
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.843 187132 DEBUG nova.compute.manager [req-cd3fa945-6356-4b46-9c78-bab8e5ca346e req-74fbb99a-9b69-4585-84a4-d168dbaf280f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Received event network-vif-plugged-4130ae9c-75bf-4b86-9a73-77d0424ede65 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.843 187132 DEBUG oslo_concurrency.lockutils [req-cd3fa945-6356-4b46-9c78-bab8e5ca346e req-74fbb99a-9b69-4585-84a4-d168dbaf280f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.844 187132 DEBUG oslo_concurrency.lockutils [req-cd3fa945-6356-4b46-9c78-bab8e5ca346e req-74fbb99a-9b69-4585-84a4-d168dbaf280f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.844 187132 DEBUG oslo_concurrency.lockutils [req-cd3fa945-6356-4b46-9c78-bab8e5ca346e req-74fbb99a-9b69-4585-84a4-d168dbaf280f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.844 187132 DEBUG nova.compute.manager [req-cd3fa945-6356-4b46-9c78-bab8e5ca346e req-74fbb99a-9b69-4585-84a4-d168dbaf280f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] No waiting events found dispatching network-vif-plugged-4130ae9c-75bf-4b86-9a73-77d0424ede65 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.845 187132 WARNING nova.compute.manager [req-cd3fa945-6356-4b46-9c78-bab8e5ca346e req-74fbb99a-9b69-4585-84a4-d168dbaf280f eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Received unexpected event network-vif-plugged-4130ae9c-75bf-4b86-9a73-77d0424ede65 for instance with vm_state active and task_state deleting.
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.853 187132 DEBUG nova.compute.manager [req-313cd188-1a12-4117-906a-02b50a71787f req-eaca3592-da3f-495a-9f91-94785a6e06f4 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Received event network-vif-deleted-4130ae9c-75bf-4b86-9a73-77d0424ede65 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.856 187132 DEBUG oslo_concurrency.lockutils [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.857 187132 DEBUG oslo_concurrency.lockutils [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.964 187132 DEBUG nova.compute.provider_tree [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180
Dec 11 01:19:48 np0005554845 nova_compute[187128]: 2025-12-11 06:19:48.980 187132 DEBUG nova.scheduler.client.report [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940
Dec 11 01:19:49 np0005554845 nova_compute[187128]: 2025-12-11 06:19:49.118 187132 DEBUG oslo_concurrency.lockutils [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.262s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:19:49 np0005554845 nova_compute[187128]: 2025-12-11 06:19:49.144 187132 INFO nova.scheduler.client.report [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Deleted allocations for instance 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230
Dec 11 01:19:49 np0005554845 nova_compute[187128]: 2025-12-11 06:19:49.202 187132 DEBUG oslo_concurrency.lockutils [None req-7c896666-90e5-4bfc-bdf4-cb6ff7524bcd 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 3.428s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:19:51 np0005554845 nova_compute[187128]: 2025-12-11 06:19:51.039 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:51 np0005554845 nova_compute[187128]: 2025-12-11 06:19:51.074 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:53.590 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=18, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=17) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 11 01:19:53 np0005554845 nova_compute[187128]: 2025-12-11 06:19:53.591 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:53 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:53.591 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 10 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274
Dec 11 01:19:56 np0005554845 nova_compute[187128]: 2025-12-11 06:19:56.041 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:56 np0005554845 nova_compute[187128]: 2025-12-11 06:19:56.075 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.224 187132 DEBUG oslo_concurrency.lockutils [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.225 187132 DEBUG oslo_concurrency.lockutils [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.226 187132 DEBUG oslo_concurrency.lockutils [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.226 187132 DEBUG oslo_concurrency.lockutils [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.226 187132 DEBUG oslo_concurrency.lockutils [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.228 187132 INFO nova.compute.manager [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Terminating instance
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.229 187132 DEBUG nova.compute.manager [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120
Dec 11 01:19:58 np0005554845 kernel: tape34c30d9-f9 (unregistering): left promiscuous mode
Dec 11 01:19:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:58Z|00306|binding|INFO|Releasing lport e34c30d9-f946-403a-8b26-75716a1be5df from this chassis (sb_readonly=0)
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.256 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:58Z|00307|binding|INFO|Setting lport e34c30d9-f946-403a-8b26-75716a1be5df down in Southbound
Dec 11 01:19:58 np0005554845 NetworkManager[55529]: <info>  [1765433998.2572] device (tape34c30d9-f9): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:19:58 np0005554845 ovn_controller[95428]: 2025-12-11T06:19:58Z|00308|binding|INFO|Removing iface tape34c30d9-f9 ovn-installed in OVS
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.259 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.268 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:06:8b:82 10.100.0.9 2001:db8:0:1:f816:3eff:fe06:8b82 2001:db8::f816:3eff:fe06:8b82'], port_security=['fa:16:3e:06:8b:82 10.100.0.9 2001:db8:0:1:f816:3eff:fe06:8b82 2001:db8::f816:3eff:fe06:8b82'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.9/28 2001:db8:0:1:f816:3eff:fe06:8b82/64 2001:db8::f816:3eff:fe06:8b82/64', 'neutron:device_id': 'ce6856f2-bbd2-465e-a1bd-4af8e2f38591', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-d06dc841-febe-4a7e-b747-ad772083d6d5', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '1338b0a4-34fa-4fa1-a3dd-77213283cd6e', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=39ee0a97-89df-4836-8fd5-1fa735eea42f, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=e34c30d9-f946-403a-8b26-75716a1be5df) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.270 104320 INFO neutron.agent.ovn.metadata.agent [-] Port e34c30d9-f946-403a-8b26-75716a1be5df in datapath d06dc841-febe-4a7e-b747-ad772083d6d5 unbound from our chassis
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.272 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network d06dc841-febe-4a7e-b747-ad772083d6d5, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.274 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.273 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[98b3e2c7-528e-42ff-ba8c-a87023929431]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.275 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5 namespace which is not needed anymore
Dec 11 01:19:58 np0005554845 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002f.scope: Deactivated successfully.
Dec 11 01:19:58 np0005554845 systemd[1]: machine-qemu\x2d23\x2dinstance\x2d0000002f.scope: Consumed 13.141s CPU time.
Dec 11 01:19:58 np0005554845 systemd-machined[153381]: Machine qemu-23-instance-0000002f terminated.
Dec 11 01:19:58 np0005554845 podman[224612]: 2025-12-11 06:19:58.34714523 +0000 UTC m=+0.059447375 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.351 187132 DEBUG nova.compute.manager [req-6a2455f8-faf4-4770-b724-485f74575e46 req-27581b6f-ce91-4cea-80ee-f41b7ec39d7e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Received event network-changed-e34c30d9-f946-403a-8b26-75716a1be5df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.351 187132 DEBUG nova.compute.manager [req-6a2455f8-faf4-4770-b724-485f74575e46 req-27581b6f-ce91-4cea-80ee-f41b7ec39d7e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Refreshing instance network info cache due to event network-changed-e34c30d9-f946-403a-8b26-75716a1be5df. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.351 187132 DEBUG oslo_concurrency.lockutils [req-6a2455f8-faf4-4770-b724-485f74575e46 req-27581b6f-ce91-4cea-80ee-f41b7ec39d7e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.351 187132 DEBUG oslo_concurrency.lockutils [req-6a2455f8-faf4-4770-b724-485f74575e46 req-27581b6f-ce91-4cea-80ee-f41b7ec39d7e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.352 187132 DEBUG nova.network.neutron [req-6a2455f8-faf4-4770-b724-485f74575e46 req-27581b6f-ce91-4cea-80ee-f41b7ec39d7e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Refreshing network info cache for port e34c30d9-f946-403a-8b26-75716a1be5df _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:19:58 np0005554845 neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5[224345]: [NOTICE]   (224349) : haproxy version is 2.8.14-c23fe91
Dec 11 01:19:58 np0005554845 neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5[224345]: [NOTICE]   (224349) : path to executable is /usr/sbin/haproxy
Dec 11 01:19:58 np0005554845 neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5[224345]: [WARNING]  (224349) : Exiting Master process...
Dec 11 01:19:58 np0005554845 neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5[224345]: [ALERT]    (224349) : Current worker (224351) exited with code 143 (Terminated)
Dec 11 01:19:58 np0005554845 neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5[224345]: [WARNING]  (224349) : All workers exited. Exiting... (0)
Dec 11 01:19:58 np0005554845 systemd[1]: libpod-87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a.scope: Deactivated successfully.
Dec 11 01:19:58 np0005554845 podman[224656]: 2025-12-11 06:19:58.425754205 +0000 UTC m=+0.048771266 container died 87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3)
Dec 11 01:19:58 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a-userdata-shm.mount: Deactivated successfully.
Dec 11 01:19:58 np0005554845 systemd[1]: var-lib-containers-storage-overlay-415bd6cb559c559d4a5b7384c5df687937c2927b93874dd2081fdc3dbf610749-merged.mount: Deactivated successfully.
Dec 11 01:19:58 np0005554845 podman[224656]: 2025-12-11 06:19:58.469252516 +0000 UTC m=+0.092269567 container cleanup 87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:19:58 np0005554845 systemd[1]: libpod-conmon-87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a.scope: Deactivated successfully.
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.492 187132 INFO nova.virt.libvirt.driver [-] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Instance destroyed successfully.#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.492 187132 DEBUG nova.objects.instance [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'resources' on Instance uuid ce6856f2-bbd2-465e-a1bd-4af8e2f38591 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.507 187132 DEBUG nova.virt.libvirt.vif [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:19:18Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-2028969009',display_name='tempest-TestGettingAddress-server-2028969009',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-2028969009',id=47,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBEEUaNKC1fg61QRRrAyHgRKiPKfEO7WxkneeQc0tv515uDr9lv/Nkq7yTmkNzmXruBDzaCGCnBqkZmh2CfljQIGqPjTR+62tkla/qpVrvL9f2FwN38U1AwR6cA9d81fr6A==',key_name='tempest-TestGettingAddress-305222553',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:19:32Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-lnegqfzi',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:19:32Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=ce6856f2-bbd2-465e-a1bd-4af8e2f38591,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.507 187132 DEBUG nova.network.os_vif_util [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.184", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": 
false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.508 187132 DEBUG nova.network.os_vif_util [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:06:8b:82,bridge_name='br-int',has_traffic_filtering=True,id=e34c30d9-f946-403a-8b26-75716a1be5df,network=Network(d06dc841-febe-4a7e-b747-ad772083d6d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c30d9-f9') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.508 187132 DEBUG os_vif [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:8b:82,bridge_name='br-int',has_traffic_filtering=True,id=e34c30d9-f946-403a-8b26-75716a1be5df,network=Network(d06dc841-febe-4a7e-b747-ad772083d6d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c30d9-f9') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.509 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.509 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape34c30d9-f9, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.511 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.512 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.514 187132 INFO os_vif [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:06:8b:82,bridge_name='br-int',has_traffic_filtering=True,id=e34c30d9-f946-403a-8b26-75716a1be5df,network=Network(d06dc841-febe-4a7e-b747-ad772083d6d5),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape34c30d9-f9')#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.515 187132 INFO nova.virt.libvirt.driver [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Deleting instance files /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591_del#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.515 187132 INFO nova.virt.libvirt.driver [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Deletion of /var/lib/nova/instances/ce6856f2-bbd2-465e-a1bd-4af8e2f38591_del complete#033[00m
Dec 11 01:19:58 np0005554845 podman[224699]: 2025-12-11 06:19:58.532809342 +0000 UTC m=+0.044990883 container remove 87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.539 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[73648a83-9d2f-44d6-939e-0fc217e629d9]: (4, ('Thu Dec 11 06:19:58 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5 (87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a)\n87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a\nThu Dec 11 06:19:58 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5 (87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a)\n87a4723ed509b92775d5f633e3ea92ddb4f375490aa8d84fa5b36dc2901d1d9a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.541 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0f0ae2ef-76a9-450b-a94f-79625e186c7f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.544 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapd06dc841-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:19:58 np0005554845 kernel: tapd06dc841-f0: left promiscuous mode
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.547 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.549 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.554 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[232b32da-f70b-49cd-968a-0bd8e6cd64c9]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.561 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.576 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7cb25a77-0387-4e30-940b-4ecf4f06f804]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.578 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[0129fd76-9f0f-4ee1-80f0-49d70ac17a95]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.583 187132 INFO nova.compute.manager [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Took 0.35 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.583 187132 DEBUG oslo.service.loopingcall [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.584 187132 DEBUG nova.compute.manager [-] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:19:58 np0005554845 nova_compute[187128]: 2025-12-11 06:19:58.584 187132 DEBUG nova.network.neutron [-] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.595 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5f96ea8a-7920-4e40-932a-57292134181b]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 424581, 'reachable_time': 24766, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 224716, 'error': None, 'target': 'ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:58 np0005554845 systemd[1]: run-netns-ovnmeta\x2dd06dc841\x2dfebe\x2d4a7e\x2db747\x2dad772083d6d5.mount: Deactivated successfully.
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.600 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-d06dc841-febe-4a7e-b747-ad772083d6d5 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:19:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:19:58.600 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[a2f4bb1f-77bd-4765-8412-0c9c695cc0be]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:19:59 np0005554845 nova_compute[187128]: 2025-12-11 06:19:59.909 187132 DEBUG nova.network.neutron [-] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:19:59 np0005554845 nova_compute[187128]: 2025-12-11 06:19:59.928 187132 INFO nova.compute.manager [-] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Took 1.34 seconds to deallocate network for instance.#033[00m
Dec 11 01:19:59 np0005554845 nova_compute[187128]: 2025-12-11 06:19:59.987 187132 DEBUG oslo_concurrency.lockutils [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:19:59 np0005554845 nova_compute[187128]: 2025-12-11 06:19:59.988 187132 DEBUG oslo_concurrency.lockutils [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.065 187132 DEBUG nova.compute.provider_tree [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.089 187132 DEBUG nova.scheduler.client.report [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.129 187132 DEBUG oslo_concurrency.lockutils [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.141s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.189 187132 INFO nova.scheduler.client.report [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Deleted allocations for instance ce6856f2-bbd2-465e-a1bd-4af8e2f38591#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.291 187132 DEBUG oslo_concurrency.lockutils [None req-24710324-b467-4df7-be24-5d2e1e377c74 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 2.066s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.452 187132 DEBUG nova.compute.manager [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Received event network-vif-unplugged-e34c30d9-f946-403a-8b26-75716a1be5df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.453 187132 DEBUG oslo_concurrency.lockutils [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.453 187132 DEBUG oslo_concurrency.lockutils [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.454 187132 DEBUG oslo_concurrency.lockutils [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.454 187132 DEBUG nova.compute.manager [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] No waiting events found dispatching network-vif-unplugged-e34c30d9-f946-403a-8b26-75716a1be5df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.455 187132 WARNING nova.compute.manager [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Received unexpected event network-vif-unplugged-e34c30d9-f946-403a-8b26-75716a1be5df for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.455 187132 DEBUG nova.compute.manager [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Received event network-vif-plugged-e34c30d9-f946-403a-8b26-75716a1be5df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.456 187132 DEBUG oslo_concurrency.lockutils [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.456 187132 DEBUG oslo_concurrency.lockutils [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.457 187132 DEBUG oslo_concurrency.lockutils [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "ce6856f2-bbd2-465e-a1bd-4af8e2f38591-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.458 187132 DEBUG nova.compute.manager [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] No waiting events found dispatching network-vif-plugged-e34c30d9-f946-403a-8b26-75716a1be5df pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.458 187132 WARNING nova.compute.manager [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Received unexpected event network-vif-plugged-e34c30d9-f946-403a-8b26-75716a1be5df for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.459 187132 DEBUG nova.compute.manager [req-fb250db2-607b-4837-98de-d35b5f145bd6 req-82e976f2-4d0f-491b-8325-77200eba4ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Received event network-vif-deleted-e34c30d9-f946-403a-8b26-75716a1be5df external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.583 187132 DEBUG nova.network.neutron [req-6a2455f8-faf4-4770-b724-485f74575e46 req-27581b6f-ce91-4cea-80ee-f41b7ec39d7e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Updated VIF entry in instance network info cache for port e34c30d9-f946-403a-8b26-75716a1be5df. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.584 187132 DEBUG nova.network.neutron [req-6a2455f8-faf4-4770-b724-485f74575e46 req-27581b6f-ce91-4cea-80ee-f41b7ec39d7e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Updating instance_info_cache with network_info: [{"id": "e34c30d9-f946-403a-8b26-75716a1be5df", "address": "fa:16:3e:06:8b:82", "network": {"id": "d06dc841-febe-4a7e-b747-ad772083d6d5", "bridge": "br-int", "label": "tempest-network-smoke--1236009285", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.9", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8:0:1::/64", "dns": [], "gateway": {"address": "2001:db8:0:1::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8:0:1:f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe06:8b82", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape34c30d9-f9", "ovs_interfaceid": "e34c30d9-f946-403a-8b26-75716a1be5df", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:20:00 np0005554845 nova_compute[187128]: 2025-12-11 06:20:00.616 187132 DEBUG oslo_concurrency.lockutils [req-6a2455f8-faf4-4770-b724-485f74575e46 req-27581b6f-ce91-4cea-80ee-f41b7ec39d7e eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-ce6856f2-bbd2-465e-a1bd-4af8e2f38591" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:20:01 np0005554845 nova_compute[187128]: 2025-12-11 06:20:01.044 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:01 np0005554845 nova_compute[187128]: 2025-12-11 06:20:01.052 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433986.0518544, 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:20:01 np0005554845 nova_compute[187128]: 2025-12-11 06:20:01.053 187132 INFO nova.compute.manager [-] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:20:01 np0005554845 nova_compute[187128]: 2025-12-11 06:20:01.078 187132 DEBUG nova.compute.manager [None req-911a90a0-7e3a-4209-b748-e14cb74a6dd1 - - - - - -] [instance: 8ff7bdb5-acb5-46eb-befe-b0ef5ecc9230] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:20:01 np0005554845 nova_compute[187128]: 2025-12-11 06:20:01.122 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:01 np0005554845 nova_compute[187128]: 2025-12-11 06:20:01.231 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:03 np0005554845 nova_compute[187128]: 2025-12-11 06:20:03.512 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:03 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:20:03.593 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '18'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:20:04 np0005554845 podman[224719]: 2025-12-11 06:20:04.159569661 +0000 UTC m=+0.091690550 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:20:06 np0005554845 nova_compute[187128]: 2025-12-11 06:20:06.045 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:06 np0005554845 podman[224739]: 2025-12-11 06:20:06.142116226 +0000 UTC m=+0.073380353 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 01:20:06 np0005554845 podman[224738]: 2025-12-11 06:20:06.151697286 +0000 UTC m=+0.086148430 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 11 01:20:08 np0005554845 nova_compute[187128]: 2025-12-11 06:20:08.515 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:10 np0005554845 podman[224785]: 2025-12-11 06:20:10.154587801 +0000 UTC m=+0.083250862 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.license=GPLv2)
Dec 11 01:20:11 np0005554845 nova_compute[187128]: 2025-12-11 06:20:11.046 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:13 np0005554845 nova_compute[187128]: 2025-12-11 06:20:13.491 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765433998.4901037, ce6856f2-bbd2-465e-a1bd-4af8e2f38591 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:20:13 np0005554845 nova_compute[187128]: 2025-12-11 06:20:13.491 187132 INFO nova.compute.manager [-] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:20:13 np0005554845 nova_compute[187128]: 2025-12-11 06:20:13.518 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:13 np0005554845 nova_compute[187128]: 2025-12-11 06:20:13.550 187132 DEBUG nova.compute.manager [None req-f567b2ed-8482-4207-b26d-5cb5a4b40bc2 - - - - - -] [instance: ce6856f2-bbd2-465e-a1bd-4af8e2f38591] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:20:15 np0005554845 podman[224808]: 2025-12-11 06:20:15.15571933 +0000 UTC m=+0.075319696 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:20:15 np0005554845 podman[224809]: 2025-12-11 06:20:15.163238874 +0000 UTC m=+0.079309264 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-type=git, distribution-scope=public, build-date=2025-08-20T13:12:41, config_id=edpm, io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, vendor=Red Hat, Inc., version=9.6, maintainer=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, managed_by=edpm_ansible, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 11 01:20:16 np0005554845 nova_compute[187128]: 2025-12-11 06:20:16.049 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:18 np0005554845 nova_compute[187128]: 2025-12-11 06:20:18.522 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:21 np0005554845 nova_compute[187128]: 2025-12-11 06:20:21.052 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:23 np0005554845 nova_compute[187128]: 2025-12-11 06:20:23.525 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:26 np0005554845 nova_compute[187128]: 2025-12-11 06:20:26.054 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:20:26.232 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:20:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:20:26.233 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:20:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:20:26.233 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:20:28 np0005554845 nova_compute[187128]: 2025-12-11 06:20:28.528 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:29 np0005554845 podman[224853]: 2025-12-11 06:20:29.126353347 +0000 UTC m=+0.055725814 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:20:31 np0005554845 nova_compute[187128]: 2025-12-11 06:20:31.055 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:33 np0005554845 nova_compute[187128]: 2025-12-11 06:20:33.531 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:34 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:20:34.003 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:b3:80:09 10.100.0.2 2001:db8::f816:3eff:feb3:8009'], port_security=[], type=localport, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': ''}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.2/28 2001:db8::f816:3eff:feb3:8009/64', 'neutron:device_id': 'ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '3', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ebfe8210-cfcf-46a1-b43f-6aa124b16ea7, chassis=[], tunnel_key=1, gateway_chassis=[], requested_chassis=[], logical_port=2d966d1a-0465-4e83-ab2f-abd82b79bd79) old=Port_Binding(mac=['fa:16:3e:b3:80:09 10.100.0.2'], external_ids={'neutron:cidrs': '10.100.0.2/28', 'neutron:device_id': 'ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'neutron:device_owner': 'network:distributed', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:20:34 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:20:34.005 104320 INFO neutron.agent.ovn.metadata.agent [-] Metadata Port 2d966d1a-0465-4e83-ab2f-abd82b79bd79 in datapath 0f33fc2f-0e53-4d9a-bdd8-75ef4128a569 updated#033[00m
Dec 11 01:20:34 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:20:34.007 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0f33fc2f-0e53-4d9a-bdd8-75ef4128a569, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:20:34 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:20:34.008 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5baa1adc-ed96-4c2f-8b18-2654b2305db3]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:20:34 np0005554845 nova_compute[187128]: 2025-12-11 06:20:34.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:20:35 np0005554845 podman[224878]: 2025-12-11 06:20:35.135785367 +0000 UTC m=+0.061450429 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 01:20:35 np0005554845 nova_compute[187128]: 2025-12-11 06:20:35.686 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:20:35 np0005554845 nova_compute[187128]: 2025-12-11 06:20:35.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:20:35 np0005554845 nova_compute[187128]: 2025-12-11 06:20:35.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.058 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.103 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.104 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.104 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.104 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.329 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.330 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5699MB free_disk=73.29163360595703GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.330 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.330 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.390 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.391 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.425 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing inventories for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.453 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating ProviderTree inventory for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.454 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.469 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing aggregate associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.505 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing trait associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, traits: COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.532 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.601 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.623 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:20:36 np0005554845 nova_compute[187128]: 2025-12-11 06:20:36.624 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.293s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:20:37 np0005554845 podman[224900]: 2025-12-11 06:20:37.116730118 +0000 UTC m=+0.051187492 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:20:37 np0005554845 podman[224901]: 2025-12-11 06:20:37.16398794 +0000 UTC m=+0.092897303 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, config_id=ovn_controller, managed_by=edpm_ansible)
Dec 11 01:20:38 np0005554845 nova_compute[187128]: 2025-12-11 06:20:38.534 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:38 np0005554845 nova_compute[187128]: 2025-12-11 06:20:38.624 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:20:38 np0005554845 nova_compute[187128]: 2025-12-11 06:20:38.624 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:20:38 np0005554845 nova_compute[187128]: 2025-12-11 06:20:38.625 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:20:38 np0005554845 nova_compute[187128]: 2025-12-11 06:20:38.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:20:38 np0005554845 nova_compute[187128]: 2025-12-11 06:20:38.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:20:38 np0005554845 nova_compute[187128]: 2025-12-11 06:20:38.715 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:20:39 np0005554845 nova_compute[187128]: 2025-12-11 06:20:39.711 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:20:40 np0005554845 nova_compute[187128]: 2025-12-11 06:20:40.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:20:40 np0005554845 nova_compute[187128]: 2025-12-11 06:20:40.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:20:41 np0005554845 nova_compute[187128]: 2025-12-11 06:20:41.059 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:41 np0005554845 podman[224946]: 2025-12-11 06:20:41.132168524 +0000 UTC m=+0.064943115 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, container_name=multipathd)
Dec 11 01:20:43 np0005554845 nova_compute[187128]: 2025-12-11 06:20:43.548 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:45 np0005554845 ovn_controller[95428]: 2025-12-11T06:20:45Z|00309|memory_trim|INFO|Detected inactivity (last active 30004 ms ago): trimming memory
Dec 11 01:20:46 np0005554845 nova_compute[187128]: 2025-12-11 06:20:46.062 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:46 np0005554845 podman[224966]: 2025-12-11 06:20:46.12722353 +0000 UTC m=+0.058650703 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:20:46 np0005554845 podman[224967]: 2025-12-11 06:20:46.160290349 +0000 UTC m=+0.079771467 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, config_id=edpm, name=ubi9-minimal, build-date=2025-08-20T13:12:41, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, vendor=Red Hat, Inc., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vcs-type=git, io.buildah.version=1.33.7, container_name=openstack_network_exporter, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6)
Dec 11 01:20:48 np0005554845 nova_compute[187128]: 2025-12-11 06:20:48.550 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:51 np0005554845 nova_compute[187128]: 2025-12-11 06:20:51.064 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:53 np0005554845 nova_compute[187128]: 2025-12-11 06:20:53.553 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:20:54.239 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=19, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=18) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:20:54 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:20:54.240 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 6 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:20:54 np0005554845 nova_compute[187128]: 2025-12-11 06:20:54.261 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:56 np0005554845 nova_compute[187128]: 2025-12-11 06:20:56.067 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:20:58 np0005554845 nova_compute[187128]: 2025-12-11 06:20:58.555 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:00 np0005554845 podman[225011]: 2025-12-11 06:21:00.109184634 +0000 UTC m=+0.047626385 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:21:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:00.242 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '19'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:01 np0005554845 nova_compute[187128]: 2025-12-11 06:21:01.069 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:03 np0005554845 nova_compute[187128]: 2025-12-11 06:21:03.608 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:06 np0005554845 nova_compute[187128]: 2025-12-11 06:21:06.071 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:06 np0005554845 podman[225035]: 2025-12-11 06:21:06.112065535 +0000 UTC m=+0.049841374 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 01:21:08 np0005554845 podman[225056]: 2025-12-11 06:21:08.124305555 +0000 UTC m=+0.053167664 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 01:21:08 np0005554845 podman[225057]: 2025-12-11 06:21:08.176478712 +0000 UTC m=+0.098164407 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_id=ovn_controller)
Dec 11 01:21:08 np0005554845 nova_compute[187128]: 2025-12-11 06:21:08.608 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:11 np0005554845 nova_compute[187128]: 2025-12-11 06:21:11.117 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.015 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.016 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.035 187132 DEBUG nova.compute.manager [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.124 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.125 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.133 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.133 187132 INFO nova.compute.claims [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:21:12 np0005554845 podman[225101]: 2025-12-11 06:21:12.146193285 +0000 UTC m=+0.071099682 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.259 187132 DEBUG nova.compute.provider_tree [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.277 187132 DEBUG nova.scheduler.client.report [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.300 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.176s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.301 187132 DEBUG nova.compute.manager [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.336 187132 DEBUG nova.compute.manager [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.337 187132 DEBUG nova.network.neutron [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.358 187132 INFO nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.372 187132 DEBUG nova.compute.manager [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.468 187132 DEBUG nova.compute.manager [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.469 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.469 187132 INFO nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Creating image(s)#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.470 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "/var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.470 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.471 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "/var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.482 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.541 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.543 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.544 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.569 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.631 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.632 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.667 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk 1073741824" returned: 0 in 0.034s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.669 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.125s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.669 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.728 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.730 187132 DEBUG nova.virt.disk.api [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Checking if we can resize image /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.731 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.806 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk --force-share --output=json" returned: 0 in 0.075s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.807 187132 DEBUG nova.virt.disk.api [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Cannot resize image /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.808 187132 DEBUG nova.objects.instance [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'migration_context' on Instance uuid 353ce6a3-8cb3-4e0c-962f-ee42e8483664 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.825 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.825 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Ensure instance console log exists: /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.826 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.827 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.827 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:12 np0005554845 nova_compute[187128]: 2025-12-11 06:21:12.886 187132 DEBUG nova.policy [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:21:13 np0005554845 nova_compute[187128]: 2025-12-11 06:21:13.610 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:15 np0005554845 nova_compute[187128]: 2025-12-11 06:21:15.425 187132 DEBUG nova.network.neutron [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Successfully created port: e1cbb3cc-b72b-471e-9234-3eed622db912 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:21:16 np0005554845 nova_compute[187128]: 2025-12-11 06:21:16.120 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:16 np0005554845 nova_compute[187128]: 2025-12-11 06:21:16.991 187132 DEBUG nova.network.neutron [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Successfully updated port: e1cbb3cc-b72b-471e-9234-3eed622db912 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:21:17 np0005554845 nova_compute[187128]: 2025-12-11 06:21:17.017 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:21:17 np0005554845 nova_compute[187128]: 2025-12-11 06:21:17.017 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquired lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:21:17 np0005554845 nova_compute[187128]: 2025-12-11 06:21:17.017 187132 DEBUG nova.network.neutron [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:21:17 np0005554845 podman[225134]: 2025-12-11 06:21:17.1426954 +0000 UTC m=+0.060116434 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:21:17 np0005554845 podman[225135]: 2025-12-11 06:21:17.172456187 +0000 UTC m=+0.086172261 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, build-date=2025-08-20T13:12:41, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, distribution-scope=public, io.openshift.expose-services=)
Dec 11 01:21:17 np0005554845 nova_compute[187128]: 2025-12-11 06:21:17.220 187132 DEBUG nova.network.neutron [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:21:17 np0005554845 nova_compute[187128]: 2025-12-11 06:21:17.346 187132 DEBUG nova.compute.manager [req-54826608-6c2d-4358-84b1-cbd098447428 req-1c84b819-18ca-46f6-9b58-0185a5570e5a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Received event network-changed-e1cbb3cc-b72b-471e-9234-3eed622db912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:21:17 np0005554845 nova_compute[187128]: 2025-12-11 06:21:17.346 187132 DEBUG nova.compute.manager [req-54826608-6c2d-4358-84b1-cbd098447428 req-1c84b819-18ca-46f6-9b58-0185a5570e5a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Refreshing instance network info cache due to event network-changed-e1cbb3cc-b72b-471e-9234-3eed622db912. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:21:17 np0005554845 nova_compute[187128]: 2025-12-11 06:21:17.347 187132 DEBUG oslo_concurrency.lockutils [req-54826608-6c2d-4358-84b1-cbd098447428 req-1c84b819-18ca-46f6-9b58-0185a5570e5a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:21:18 np0005554845 nova_compute[187128]: 2025-12-11 06:21:18.654 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.130 187132 DEBUG nova.network.neutron [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Updating instance_info_cache with network_info: [{"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.158 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Releasing lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.159 187132 DEBUG nova.compute.manager [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Instance network_info: |[{"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.159 187132 DEBUG oslo_concurrency.lockutils [req-54826608-6c2d-4358-84b1-cbd098447428 req-1c84b819-18ca-46f6-9b58-0185a5570e5a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.160 187132 DEBUG nova.network.neutron [req-54826608-6c2d-4358-84b1-cbd098447428 req-1c84b819-18ca-46f6-9b58-0185a5570e5a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Refreshing network info cache for port e1cbb3cc-b72b-471e-9234-3eed622db912 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.163 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Start _get_guest_xml network_info=[{"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.169 187132 WARNING nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.174 187132 DEBUG nova.virt.libvirt.host [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.175 187132 DEBUG nova.virt.libvirt.host [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.177 187132 DEBUG nova.virt.libvirt.host [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.178 187132 DEBUG nova.virt.libvirt.host [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.180 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.180 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.181 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.181 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.181 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.181 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.182 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.182 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.182 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.182 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.183 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.183 187132 DEBUG nova.virt.hardware [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.187 187132 DEBUG nova.virt.libvirt.vif [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:21:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-245562475',display_name='tempest-TestGettingAddress-server-245562475',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-245562475',id=50,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAHwLwmbXzLhtx3QXDuN5RkoUozSBRT+uWfsT30PdOf4/6PFp0T6c3r6yQuJi9Su0GF8tbhpa3DTVgljsT6UKFBBTDUvunUKiKj2BIGYlmQqceEatpHrQFERUHlKKGvwOg==',key_name='tempest-TestGettingAddress-1710968490',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-e50j7w4d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:21:12Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=353ce6a3-8cb3-4e0c-962f-ee42e8483664,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.187 187132 DEBUG nova.network.os_vif_util [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.188 187132 DEBUG nova.network.os_vif_util [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:2e:2d,bridge_name='br-int',has_traffic_filtering=True,id=e1cbb3cc-b72b-471e-9234-3eed622db912,network=Network(0f33fc2f-0e53-4d9a-bdd8-75ef4128a569),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1cbb3cc-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.189 187132 DEBUG nova.objects.instance [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'pci_devices' on Instance uuid 353ce6a3-8cb3-4e0c-962f-ee42e8483664 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.230 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  <uuid>353ce6a3-8cb3-4e0c-962f-ee42e8483664</uuid>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  <name>instance-00000032</name>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <nova:name>tempest-TestGettingAddress-server-245562475</nova:name>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:21:19</nova:creationTime>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:        <nova:user uuid="60e9372de4754580913a836e11b9c248">tempest-TestGettingAddress-725523770-project-member</nova:user>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:        <nova:project uuid="79a211a6fc3c4f68b6c3d0ba433964d3">tempest-TestGettingAddress-725523770</nova:project>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:        <nova:port uuid="e1cbb3cc-b72b-471e-9234-3eed622db912">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.13" ipVersion="4"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="2001:db8::f816:3eff:fe58:2e2d" ipVersion="6"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <entry name="serial">353ce6a3-8cb3-4e0c-962f-ee42e8483664</entry>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <entry name="uuid">353ce6a3-8cb3-4e0c-962f-ee42e8483664</entry>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.config"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:58:2e:2d"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <target dev="tape1cbb3cc-b7"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/console.log" append="off"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:21:19 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:21:19 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:21:19 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:21:19 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.232 187132 DEBUG nova.compute.manager [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Preparing to wait for external event network-vif-plugged-e1cbb3cc-b72b-471e-9234-3eed622db912 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.232 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.233 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.233 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.233 187132 DEBUG nova.virt.libvirt.vif [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:21:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-TestGettingAddress-server-245562475',display_name='tempest-TestGettingAddress-server-245562475',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-245562475',id=50,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAHwLwmbXzLhtx3QXDuN5RkoUozSBRT+uWfsT30PdOf4/6PFp0T6c3r6yQuJi9Su0GF8tbhpa3DTVgljsT6UKFBBTDUvunUKiKj2BIGYlmQqceEatpHrQFERUHlKKGvwOg==',key_name='tempest-TestGettingAddress-1710968490',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-e50j7w4d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:21:12Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=353ce6a3-8cb3-4e0c-962f-ee42e8483664,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": 
"2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.234 187132 DEBUG nova.network.os_vif_util [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.235 187132 DEBUG nova.network.os_vif_util [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:58:2e:2d,bridge_name='br-int',has_traffic_filtering=True,id=e1cbb3cc-b72b-471e-9234-3eed622db912,network=Network(0f33fc2f-0e53-4d9a-bdd8-75ef4128a569),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1cbb3cc-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.235 187132 DEBUG os_vif [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:2e:2d,bridge_name='br-int',has_traffic_filtering=True,id=e1cbb3cc-b72b-471e-9234-3eed622db912,network=Network(0f33fc2f-0e53-4d9a-bdd8-75ef4128a569),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1cbb3cc-b7') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.236 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.236 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.236 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.241 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.242 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tape1cbb3cc-b7, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.242 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tape1cbb3cc-b7, col_values=(('external_ids', {'iface-id': 'e1cbb3cc-b72b-471e-9234-3eed622db912', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:58:2e:2d', 'vm-uuid': '353ce6a3-8cb3-4e0c-962f-ee42e8483664'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.244 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:19 np0005554845 NetworkManager[55529]: <info>  [1765434079.2451] manager: (tape1cbb3cc-b7): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/157)
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.245 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.252 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.253 187132 INFO os_vif [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:58:2e:2d,bridge_name='br-int',has_traffic_filtering=True,id=e1cbb3cc-b72b-471e-9234-3eed622db912,network=Network(0f33fc2f-0e53-4d9a-bdd8-75ef4128a569),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1cbb3cc-b7')#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.366 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.367 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.367 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] No VIF found with MAC fa:16:3e:58:2e:2d, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.368 187132 INFO nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Using config drive#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.729 187132 INFO nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Creating config drive at /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.config#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.735 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc4qaum77 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.874 187132 DEBUG oslo_concurrency.processutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpc4qaum77" returned: 0 in 0.140s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:19 np0005554845 kernel: tape1cbb3cc-b7: entered promiscuous mode
Dec 11 01:21:19 np0005554845 NetworkManager[55529]: <info>  [1765434079.9524] manager: (tape1cbb3cc-b7): new Tun device (/org/freedesktop/NetworkManager/Devices/158)
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.978 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:21:19Z|00310|binding|INFO|Claiming lport e1cbb3cc-b72b-471e-9234-3eed622db912 for this chassis.
Dec 11 01:21:19 np0005554845 ovn_controller[95428]: 2025-12-11T06:21:19Z|00311|binding|INFO|e1cbb3cc-b72b-471e-9234-3eed622db912: Claiming fa:16:3e:58:2e:2d 10.100.0.13 2001:db8::f816:3eff:fe58:2e2d
Dec 11 01:21:19 np0005554845 systemd-udevd[225192]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.994 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:19 np0005554845 nova_compute[187128]: 2025-12-11 06:21:19.997 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:20 np0005554845 NetworkManager[55529]: <info>  [1765434080.0002] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/159)
Dec 11 01:21:20 np0005554845 NetworkManager[55529]: <info>  [1765434080.0020] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/160)
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.002 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:2e:2d 10.100.0.13 2001:db8::f816:3eff:fe58:2e2d'], port_security=['fa:16:3e:58:2e:2d 10.100.0.13 2001:db8::f816:3eff:fe58:2e2d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe58:2e2d/64', 'neutron:device_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '2', 'neutron:security_group_ids': '2fc6dc65-0cfe-4a46-8eed-41ba0eb7cf58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ebfe8210-cfcf-46a1-b43f-6aa124b16ea7, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=e1cbb3cc-b72b-471e-9234-3eed622db912) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.003 104320 INFO neutron.agent.ovn.metadata.agent [-] Port e1cbb3cc-b72b-471e-9234-3eed622db912 in datapath 0f33fc2f-0e53-4d9a-bdd8-75ef4128a569 bound to our chassis#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.004 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 0f33fc2f-0e53-4d9a-bdd8-75ef4128a569#033[00m
Dec 11 01:21:20 np0005554845 systemd-machined[153381]: New machine qemu-24-instance-00000032.
Dec 11 01:21:20 np0005554845 NetworkManager[55529]: <info>  [1765434080.0131] device (tape1cbb3cc-b7): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:21:20 np0005554845 NetworkManager[55529]: <info>  [1765434080.0155] device (tape1cbb3cc-b7): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.018 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ad9e7fe3-5d88-4a30-811b-8280829b4410]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.018 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap0f33fc2f-01 in ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.020 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap0f33fc2f-00 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.020 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b76e1c39-186c-47b0-a90e-7ad62d995928]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.021 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[e75e4e0c-f35c-4bb4-8961-24123fb5ed66]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.034 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[06dd9276-b47c-4371-9848-2d88fa8d5622]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 systemd[1]: Started Virtual Machine qemu-24-instance-00000032.
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.066 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[2b09207b-1aa8-49c9-a386-43f6dc7cdc45]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.092 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[1b39738e-7d10-460c-81e6-f0ad5fdc5e57]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 systemd-udevd[225198]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:21:20 np0005554845 NetworkManager[55529]: <info>  [1765434080.1070] manager: (tap0f33fc2f-00): new Veth device (/org/freedesktop/NetworkManager/Devices/161)
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.106 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8b8fe336-f215-4896-b44d-df1935a8a269]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.109 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.124 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:20 np0005554845 ovn_controller[95428]: 2025-12-11T06:21:20Z|00312|binding|INFO|Setting lport e1cbb3cc-b72b-471e-9234-3eed622db912 ovn-installed in OVS
Dec 11 01:21:20 np0005554845 ovn_controller[95428]: 2025-12-11T06:21:20Z|00313|binding|INFO|Setting lport e1cbb3cc-b72b-471e-9234-3eed622db912 up in Southbound
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.135 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.136 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[f2c0156a-237f-4bfe-866e-102fce7ab16e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.138 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[2c7b8f35-9fd6-4443-93be-960b3e750bc9]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 NetworkManager[55529]: <info>  [1765434080.1607] device (tap0f33fc2f-00): carrier: link connected
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.165 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[d0b5b438-a6bc-45f2-bf89-4c8375beb4cb]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.181 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[43b07a6d-643a-4d2a-956b-1d30dd3868bd]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f33fc2f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:80:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435480, 'reachable_time': 44618, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225228, 'error': None, 'target': 'ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.195 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[928be264-a3d5-4c05-95fb-ad588b93981f]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feb3:8009'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 435480, 'tstamp': 435480}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225229, 'error': None, 'target': 'ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.212 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b7748c55-760e-4bf4-b2fe-929414adc5ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap0f33fc2f-01'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:b3:80:09'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 94], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435480, 'reachable_time': 44618, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225230, 'error': None, 'target': 'ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.243 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cf86de16-dfa1-4827-86aa-63c8028e47cf]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.319 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[e4ea9ae6-1136-4575-bc4e-46da3b83f93c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.321 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f33fc2f-00, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.321 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.322 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap0f33fc2f-00, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.324 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:20 np0005554845 NetworkManager[55529]: <info>  [1765434080.3257] manager: (tap0f33fc2f-00): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/162)
Dec 11 01:21:20 np0005554845 kernel: tap0f33fc2f-00: entered promiscuous mode
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.328 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.329 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap0f33fc2f-00, col_values=(('external_ids', {'iface-id': '2d966d1a-0465-4e83-ab2f-abd82b79bd79'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.331 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:20 np0005554845 ovn_controller[95428]: 2025-12-11T06:21:20Z|00314|binding|INFO|Releasing lport 2d966d1a-0465-4e83-ab2f-abd82b79bd79 from this chassis (sb_readonly=0)
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.355 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.356 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/0f33fc2f-0e53-4d9a-bdd8-75ef4128a569.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/0f33fc2f-0e53-4d9a-bdd8-75ef4128a569.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.357 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b10f08f0-468d-40f9-be46-2366ba9a4ee5]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.358 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/0f33fc2f-0e53-4d9a-bdd8-75ef4128a569.pid.haproxy
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 0f33fc2f-0e53-4d9a-bdd8-75ef4128a569
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:21:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:20.360 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'env', 'PROCESS_TAG=haproxy-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/0f33fc2f-0e53-4d9a-bdd8-75ef4128a569.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.468 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765434080.467616, 353ce6a3-8cb3-4e0c-962f-ee42e8483664 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.469 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] VM Started (Lifecycle Event)#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.693 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.699 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765434080.4678624, 353ce6a3-8cb3-4e0c-962f-ee42e8483664 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.700 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.737 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.742 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:21:20 np0005554845 nova_compute[187128]: 2025-12-11 06:21:20.762 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:21:20 np0005554845 podman[225268]: 2025-12-11 06:21:20.802495928 +0000 UTC m=+0.065757977 container create 7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 01:21:20 np0005554845 systemd[1]: Started libpod-conmon-7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249.scope.
Dec 11 01:21:20 np0005554845 podman[225268]: 2025-12-11 06:21:20.774432146 +0000 UTC m=+0.037694215 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:21:20 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:21:20 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/556507203e68ef2740e636b1160b303b2ee950254660ab21e34829c87c7689ef/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:21:20 np0005554845 podman[225268]: 2025-12-11 06:21:20.896220073 +0000 UTC m=+0.159482212 container init 7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:21:20 np0005554845 podman[225268]: 2025-12-11 06:21:20.90201569 +0000 UTC m=+0.165277769 container start 7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202)
Dec 11 01:21:20 np0005554845 neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569[225283]: [NOTICE]   (225287) : New worker (225289) forked
Dec 11 01:21:20 np0005554845 neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569[225283]: [NOTICE]   (225287) : Loading success.
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.140 187132 DEBUG nova.compute.manager [req-040cce02-877e-419a-8908-c6510ad0d72c req-60aaf479-2c7f-470d-bf22-0866f6b6b25a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Received event network-vif-plugged-e1cbb3cc-b72b-471e-9234-3eed622db912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.141 187132 DEBUG oslo_concurrency.lockutils [req-040cce02-877e-419a-8908-c6510ad0d72c req-60aaf479-2c7f-470d-bf22-0866f6b6b25a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.142 187132 DEBUG oslo_concurrency.lockutils [req-040cce02-877e-419a-8908-c6510ad0d72c req-60aaf479-2c7f-470d-bf22-0866f6b6b25a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.142 187132 DEBUG oslo_concurrency.lockutils [req-040cce02-877e-419a-8908-c6510ad0d72c req-60aaf479-2c7f-470d-bf22-0866f6b6b25a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.143 187132 DEBUG nova.compute.manager [req-040cce02-877e-419a-8908-c6510ad0d72c req-60aaf479-2c7f-470d-bf22-0866f6b6b25a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Processing event network-vif-plugged-e1cbb3cc-b72b-471e-9234-3eed622db912 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.144 187132 DEBUG nova.compute.manager [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.149 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765434081.1494431, 353ce6a3-8cb3-4e0c-962f-ee42e8483664 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.150 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.154 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.159 187132 INFO nova.virt.libvirt.driver [-] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Instance spawned successfully.#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.159 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.168 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.175 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.186 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.195 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.196 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.197 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.198 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.199 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.200 187132 DEBUG nova.virt.libvirt.driver [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.209 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.213 187132 DEBUG nova.network.neutron [req-54826608-6c2d-4358-84b1-cbd098447428 req-1c84b819-18ca-46f6-9b58-0185a5570e5a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Updated VIF entry in instance network info cache for port e1cbb3cc-b72b-471e-9234-3eed622db912. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.213 187132 DEBUG nova.network.neutron [req-54826608-6c2d-4358-84b1-cbd098447428 req-1c84b819-18ca-46f6-9b58-0185a5570e5a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Updating instance_info_cache with network_info: [{"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.239 187132 DEBUG oslo_concurrency.lockutils [req-54826608-6c2d-4358-84b1-cbd098447428 req-1c84b819-18ca-46f6-9b58-0185a5570e5a eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.271 187132 INFO nova.compute.manager [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Took 8.80 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.272 187132 DEBUG nova.compute.manager [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.362 187132 INFO nova.compute.manager [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Took 9.27 seconds to build instance.#033[00m
Dec 11 01:21:21 np0005554845 nova_compute[187128]: 2025-12-11 06:21:21.377 187132 DEBUG oslo_concurrency.lockutils [None req-b804a745-1084-40f8-b9dc-6680ac667e58 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 9.361s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:23 np0005554845 nova_compute[187128]: 2025-12-11 06:21:23.221 187132 DEBUG nova.compute.manager [req-b24f42b9-f185-4f8e-86ae-b580a2c27648 req-df69b5cd-1319-4b27-a082-adce6cb46fca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Received event network-vif-plugged-e1cbb3cc-b72b-471e-9234-3eed622db912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:21:23 np0005554845 nova_compute[187128]: 2025-12-11 06:21:23.221 187132 DEBUG oslo_concurrency.lockutils [req-b24f42b9-f185-4f8e-86ae-b580a2c27648 req-df69b5cd-1319-4b27-a082-adce6cb46fca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:23 np0005554845 nova_compute[187128]: 2025-12-11 06:21:23.222 187132 DEBUG oslo_concurrency.lockutils [req-b24f42b9-f185-4f8e-86ae-b580a2c27648 req-df69b5cd-1319-4b27-a082-adce6cb46fca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:23 np0005554845 nova_compute[187128]: 2025-12-11 06:21:23.223 187132 DEBUG oslo_concurrency.lockutils [req-b24f42b9-f185-4f8e-86ae-b580a2c27648 req-df69b5cd-1319-4b27-a082-adce6cb46fca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:23 np0005554845 nova_compute[187128]: 2025-12-11 06:21:23.223 187132 DEBUG nova.compute.manager [req-b24f42b9-f185-4f8e-86ae-b580a2c27648 req-df69b5cd-1319-4b27-a082-adce6cb46fca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] No waiting events found dispatching network-vif-plugged-e1cbb3cc-b72b-471e-9234-3eed622db912 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:21:23 np0005554845 nova_compute[187128]: 2025-12-11 06:21:23.224 187132 WARNING nova.compute.manager [req-b24f42b9-f185-4f8e-86ae-b580a2c27648 req-df69b5cd-1319-4b27-a082-adce6cb46fca eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Received unexpected event network-vif-plugged-e1cbb3cc-b72b-471e-9234-3eed622db912 for instance with vm_state active and task_state None.#033[00m
Dec 11 01:21:24 np0005554845 nova_compute[187128]: 2025-12-11 06:21:24.246 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:26 np0005554845 nova_compute[187128]: 2025-12-11 06:21:26.148 187132 DEBUG nova.compute.manager [req-bb79d2c3-5420-4581-b0cb-11d6c2ff132c req-f1bf3a14-7a0b-4df3-a921-82aec9e77ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Received event network-changed-e1cbb3cc-b72b-471e-9234-3eed622db912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:21:26 np0005554845 nova_compute[187128]: 2025-12-11 06:21:26.149 187132 DEBUG nova.compute.manager [req-bb79d2c3-5420-4581-b0cb-11d6c2ff132c req-f1bf3a14-7a0b-4df3-a921-82aec9e77ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Refreshing instance network info cache due to event network-changed-e1cbb3cc-b72b-471e-9234-3eed622db912. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:21:26 np0005554845 nova_compute[187128]: 2025-12-11 06:21:26.149 187132 DEBUG oslo_concurrency.lockutils [req-bb79d2c3-5420-4581-b0cb-11d6c2ff132c req-f1bf3a14-7a0b-4df3-a921-82aec9e77ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:21:26 np0005554845 nova_compute[187128]: 2025-12-11 06:21:26.150 187132 DEBUG oslo_concurrency.lockutils [req-bb79d2c3-5420-4581-b0cb-11d6c2ff132c req-f1bf3a14-7a0b-4df3-a921-82aec9e77ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:21:26 np0005554845 nova_compute[187128]: 2025-12-11 06:21:26.150 187132 DEBUG nova.network.neutron [req-bb79d2c3-5420-4581-b0cb-11d6c2ff132c req-f1bf3a14-7a0b-4df3-a921-82aec9e77ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Refreshing network info cache for port e1cbb3cc-b72b-471e-9234-3eed622db912 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:21:26 np0005554845 nova_compute[187128]: 2025-12-11 06:21:26.170 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:26.233 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:26.233 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:26.234 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:29 np0005554845 nova_compute[187128]: 2025-12-11 06:21:29.091 187132 DEBUG nova.network.neutron [req-bb79d2c3-5420-4581-b0cb-11d6c2ff132c req-f1bf3a14-7a0b-4df3-a921-82aec9e77ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Updated VIF entry in instance network info cache for port e1cbb3cc-b72b-471e-9234-3eed622db912. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:21:29 np0005554845 nova_compute[187128]: 2025-12-11 06:21:29.092 187132 DEBUG nova.network.neutron [req-bb79d2c3-5420-4581-b0cb-11d6c2ff132c req-f1bf3a14-7a0b-4df3-a921-82aec9e77ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Updating instance_info_cache with network_info: [{"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:21:29 np0005554845 nova_compute[187128]: 2025-12-11 06:21:29.143 187132 DEBUG oslo_concurrency.lockutils [req-bb79d2c3-5420-4581-b0cb-11d6c2ff132c req-f1bf3a14-7a0b-4df3-a921-82aec9e77ed7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:21:29 np0005554845 nova_compute[187128]: 2025-12-11 06:21:29.248 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.103 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'name': 'tempest-TestGettingAddress-server-245562475', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000032', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'user_id': '60e9372de4754580913a836e11b9c248', 'hostId': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.104 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.104 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.105 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-TestGettingAddress-server-245562475>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-245562475>]
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.105 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.105 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.105 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-245562475>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-245562475>]
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.106 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.140 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.read.bytes volume: 23775232 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.141 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.read.bytes volume: 2048 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1c198716-d0e1-4024-adbd-18645361891d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 23775232, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-vda', 'timestamp': '2025-12-11T06:21:30.106409', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a1ad5048-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': 'ebc0e5900b152c0e1817cb22bcad7ae4115f7152e7b0acbf2c8cdab0bc53d2d1'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 2048, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-sda', 'timestamp': '2025-12-11T06:21:30.106409', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a1ad5e58-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': '2ae25ccdf570fbe1a2b8069a8145cf7b84379ca09e944d4b9bd1cff457137ed3'}]}, 'timestamp': '2025-12-11 06:21:30.141432', '_unique_id': '155b2b32110c48dfa0d4cedb39f91f9b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.142 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.143 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.147 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 353ce6a3-8cb3-4e0c-962f-ee42e8483664 / tape1cbb3cc-b7 inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.148 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/network.incoming.bytes volume: 90 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '109a896e-97c9-436b-a8c5-31893dff8db5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 90, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000032-353ce6a3-8cb3-4e0c-962f-ee42e8483664-tape1cbb3cc-b7', 'timestamp': '2025-12-11T06:21:30.143741', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'tape1cbb3cc-b7', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:2e:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape1cbb3cc-b7'}, 'message_id': 'a1ae7338-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.845444961, 'message_signature': '3e2a16e3f7259aaf6ec8278a52a193b86e50c55179bc5013092faf582752e0b8'}]}, 'timestamp': '2025-12-11 06:21:30.148540', '_unique_id': 'cd67f171843e422abc679b5fe9fec500'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.149 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.150 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.169 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/cpu volume: 8750000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a5df636c-c08d-40c2-96ae-6216e45f1349', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 8750000000, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'timestamp': '2025-12-11T06:21:30.150098', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'a1b1bf8e-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.87117674, 'message_signature': 'efb18a1de9ea5b1ebd64dc3464778299d51f6b67aad9fc8f7c48353c7e8ac63e'}]}, 'timestamp': '2025-12-11 06:21:30.170116', '_unique_id': 'd92c366cca5d45c69901e273b852786c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.171 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.185 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.allocation volume: 204800 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.186 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cd06bab2-1074-4e85-a20e-99f05c43bbbf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 204800, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-vda', 'timestamp': '2025-12-11T06:21:30.171970', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a1b43ff2-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.873671217, 'message_signature': '6d0063c5d82f1a648212577b596f0d1b49d83e61b860cd029039f6ec2ab7b517'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-sda', 'timestamp': '2025-12-11T06:21:30.171970', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a1b44efc-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.873671217, 'message_signature': 'f9f6c191451d3946105185bf3b8848e1ebce3a2c99decfce629ce61c2053dd0e'}]}, 'timestamp': '2025-12-11 06:21:30.186884', '_unique_id': 'a56114b704454283b75e84d0c07fefd4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.187 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.188 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.188 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '95642aea-451f-4574-9dd3-9446a3c17ea6', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000032-353ce6a3-8cb3-4e0c-962f-ee42e8483664-tape1cbb3cc-b7', 'timestamp': '2025-12-11T06:21:30.188888', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'tape1cbb3cc-b7', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:2e:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape1cbb3cc-b7'}, 'message_id': 'a1b4aa5a-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.845444961, 'message_signature': 'ea96aa0226834070215b0b93e8fb20d297e6e62a297505d86deda6bf61bba6bd'}]}, 'timestamp': '2025-12-11 06:21:30.189235', '_unique_id': '3f0847c164224d57905927e02fbc0c55'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.189 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.193 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.193 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4ed092ce-87a7-459a-8fa7-7cd45063cc2d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000032-353ce6a3-8cb3-4e0c-962f-ee42e8483664-tape1cbb3cc-b7', 'timestamp': '2025-12-11T06:21:30.193303', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'tape1cbb3cc-b7', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:2e:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape1cbb3cc-b7'}, 'message_id': 'a1b55a68-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.845444961, 'message_signature': '52d536a69d58af274396631a0219999009fc98da8c90448930a4bf87ff82c564'}]}, 'timestamp': '2025-12-11 06:21:30.193779', '_unique_id': 'f2673da385304ff7b7ed8bd663cf92f4'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.194 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.195 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.195 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.read.requests volume: 760 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.195 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.read.requests volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '38791766-4608-4805-b3b1-dcb4065f4a3f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 760, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-vda', 'timestamp': '2025-12-11T06:21:30.195671', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a1b5b2d8-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': 'bac1046b33fa571c669c3480ead90e5e9f77ca50fbd211e3286c873ee0ced0d6'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 
'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-sda', 'timestamp': '2025-12-11T06:21:30.195671', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a1b5bdfa-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': '29c57bf5492503b95d53e2c742dacc1cd736ab20f9108a0e04fc37767d34f4e7'}]}, 'timestamp': '2025-12-11 06:21:30.196248', '_unique_id': '5a69f287ec72490faf1bcbf11086a78d'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.197 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.read.latency volume: 118348753 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.198 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.read.latency volume: 651987 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'df72e0eb-ef6b-4cae-9201-28269b4552fd', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 118348753, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-vda', 'timestamp': '2025-12-11T06:21:30.197890', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a1b60936-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': '0cc6a4941e9381abb315d9008cf5f38b82fd03e0f9c82e9dd75312a7b9bd0f9b'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 651987, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 
'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-sda', 'timestamp': '2025-12-11T06:21:30.197890', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a1b616e2-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': 'a24bc2c7e6d24d87fa38488eb2f94ce527cf1358a4d1ccdea8611561ed1ceab9'}]}, 'timestamp': '2025-12-11 06:21:30.198536', '_unique_id': '15595549be884420a262c36634a46be2'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.199 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.200 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/memory.usage volume: Unavailable _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.200 12 WARNING ceilometer.compute.pollsters [-] memory.usage statistic in not available for instance 353ce6a3-8cb3-4e0c-962f-ee42e8483664: ceilometer.compute.pollsters.NoVolumeException
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.200 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.200 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/network.outgoing.packets volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '9ae3e1e2-f7c1-4725-814a-2b1ebc22b6e7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000032-353ce6a3-8cb3-4e0c-962f-ee42e8483664-tape1cbb3cc-b7', 'timestamp': '2025-12-11T06:21:30.200340', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'tape1cbb3cc-b7', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:2e:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape1cbb3cc-b7'}, 'message_id': 'a1b66980-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.845444961, 'message_signature': 'e6f4086e79ae4414bca8e37dfe8211af099f83699f1417efdf5b6290d7883ecd'}]}, 'timestamp': '2025-12-11 06:21:30.200656', '_unique_id': '6b7c272c844049fd9b030c4788e468c7'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.201 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.202 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.202 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.202 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '261530f5-f358-45f3-996e-14930e72e807', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-vda', 'timestamp': '2025-12-11T06:21:30.202187', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a1b6b14c-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': '90bd1a9b7624ceff53c31d113131807080e114e929d5a134c7f3ef1dac796e2d'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-sda', 'timestamp': '2025-12-11T06:21:30.202187', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a1b6bd5e-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': '7c896731810465d398b42aa834afd905f9c6d86eb5c034e31af3a6b63e7a48d0'}]}, 'timestamp': '2025-12-11 06:21:30.202791', '_unique_id': '5e582e044e184e8dbd9c82e5c86c9d0c'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.203 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.204 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.204 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ec1b43ce-73d3-432c-a5e1-7a16561d2172', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000032-353ce6a3-8cb3-4e0c-962f-ee42e8483664-tape1cbb3cc-b7', 'timestamp': '2025-12-11T06:21:30.204547', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'tape1cbb3cc-b7', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:2e:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape1cbb3cc-b7'}, 'message_id': 'a1b70dc2-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.845444961, 'message_signature': '99d460a5653a996d22564620c098e120db7e04dabacbce909691ef00464bc6d0'}]}, 'timestamp': '2025-12-11 06:21:30.204880', '_unique_id': 'f5ffce1dc07f472c892ee3036225cef6'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.205 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.206 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.206 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4546ad75-9c11-4ba5-8d08-2c5bcaba06c5', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000032-353ce6a3-8cb3-4e0c-962f-ee42e8483664-tape1cbb3cc-b7', 'timestamp': '2025-12-11T06:21:30.206697', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'tape1cbb3cc-b7', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:2e:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape1cbb3cc-b7'}, 'message_id': 'a1b761dc-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.845444961, 'message_signature': '056cb774e844b56a5eddf90b1249c8a24d1b9b1aed517e40672fead50acd7bbe'}]}, 'timestamp': '2025-12-11 06:21:30.207039', '_unique_id': 'ce0ec1ebb3284f83a6905386796609dd'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.208 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.208 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b56c5fb-bad2-436c-bf96-7b73e9935869', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-vda', 'timestamp': '2025-12-11T06:21:30.208662', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a1b7ae12-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.873671217, 'message_signature': '1a85703b51b56a843be73a867de06596556330e2b60273ecaffd02951f358ec3'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-sda', 'timestamp': '2025-12-11T06:21:30.208662', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a1b7b95c-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.873671217, 'message_signature': '45eeb2ac0715acc63259c7685399387b6409329b999aa17643aa5a2466639494'}]}, 'timestamp': '2025-12-11 06:21:30.209240', '_unique_id': 'f8f847c38e7649868460adc23f860d4b'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.209 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.211 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.211 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.211 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-TestGettingAddress-server-245562475>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-245562475>]
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.211 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.211 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-TestGettingAddress-server-245562475>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-TestGettingAddress-server-245562475>]
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.211 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.211 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4d7e450d-c98b-4f59-968b-4f2c03044182', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000032-353ce6a3-8cb3-4e0c-962f-ee42e8483664-tape1cbb3cc-b7', 'timestamp': '2025-12-11T06:21:30.211896', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'tape1cbb3cc-b7', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:2e:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape1cbb3cc-b7'}, 'message_id': 'a1b82cca-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.845444961, 'message_signature': '23f32f345a0067065ad1bae2521e88eccdb77fe83948b92690ca44156ca7a607'}]}, 'timestamp': '2025-12-11 06:21:30.212234', '_unique_id': '3b9c3eb3918a45d29c077e5643060384'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.213 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/network.incoming.packets volume: 1 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '6d55b90a-5f5a-479f-99c5-aaf063e8ceb8', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 1, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000032-353ce6a3-8cb3-4e0c-962f-ee42e8483664-tape1cbb3cc-b7', 'timestamp': '2025-12-11T06:21:30.213879', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'tape1cbb3cc-b7', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:2e:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape1cbb3cc-b7'}, 'message_id': 'a1b879d2-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.845444961, 'message_signature': '7b9f32511554f0b9265ced271e5cd0b5e0cb351039126d18ee01ca8e8da9dbe1'}]}, 'timestamp': '2025-12-11 06:21:30.214232', '_unique_id': 'ba820eda616c4674a19115220b679d36'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.214 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.215 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b3878598-c10f-4a72-8269-3e523782f6e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000032-353ce6a3-8cb3-4e0c-962f-ee42e8483664-tape1cbb3cc-b7', 'timestamp': '2025-12-11T06:21:30.215761', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'tape1cbb3cc-b7', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:2e:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape1cbb3cc-b7'}, 'message_id': 'a1b8c3a6-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.845444961, 'message_signature': '8d2f785f1d799068da0c9995af357539221d38daae59d918aa798c1c6870a51b'}]}, 'timestamp': '2025-12-11 06:21:30.216081', '_unique_id': '5b8ca4ffa2ac4259b24398b7e901b780'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.216 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.217 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.217 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.217 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24eddee6-de12-4575-bbc0-872e4a27a3e2', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-vda', 'timestamp': '2025-12-11T06:21:30.217661', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a1b90dc0-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': 'f9bd6b54d6af9865f8d047c04f7e2c34772ea75d854c9c2bf3b49d1ecd61857f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-sda', 'timestamp': '2025-12-11T06:21:30.217661', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a1b9189c-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': '3d8c4ccda8f5082d5abd198a14fa463a32d425df7c1a2c1640045e2fb43cdb25'}]}, 'timestamp': '2025-12-11 06:21:30.218262', '_unique_id': '44cb6ad9c36c46ff8d180a13bf30a3ac'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.218 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.219 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.219 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.220 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '060af604-86fb-40cc-a078-cd855078ec6d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-vda', 'timestamp': '2025-12-11T06:21:30.219830', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a1b9620c-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': 'f9848c70fa300eb2612421a258df4a4ca7c672accdfce8c0e0112fd3fcc8ed5c'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-sda', 'timestamp': '2025-12-11T06:21:30.219830', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a1b96f40-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.808106817, 'message_signature': '508ead0e4b5fb68ee123f2cf559e6672bbc2c057d667b2c549f30ac8e1a91a4e'}]}, 'timestamp': '2025-12-11 06:21:30.220478', '_unique_id': '925540a04cb44c2e8df48b275febfa09'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.221 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.222 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.222 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/network.outgoing.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'fde9c6be-4ddf-41cf-996f-eea2d6223832', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': 'instance-00000032-353ce6a3-8cb3-4e0c-962f-ee42e8483664-tape1cbb3cc-b7', 'timestamp': '2025-12-11T06:21:30.222114', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'tape1cbb3cc-b7', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:58:2e:2d', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tape1cbb3cc-b7'}, 'message_id': 'a1b9bd38-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.845444961, 'message_signature': '7cdf098dee28759d2608e6ef4eee01fb62bb8ed180dbfba62ce151caafa1fbd8'}]}, 'timestamp': '2025-12-11 06:21:30.222503', '_unique_id': 'c0320d2b1d594a8c85b5688f514dbd14'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.223 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.224 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.usage volume: 196624 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.224 12 DEBUG ceilometer.compute.pollsters [-] 353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '24032983-ad15-4566-8413-f81574b14148', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 196624, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-vda', 'timestamp': '2025-12-11T06:21:30.224063', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'a1ba07a2-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.873671217, 'message_signature': '54fc540536c3bb3f37c70939b270e8dc31e4d52df5773e714e5ed754c0cdb482'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '60e9372de4754580913a836e11b9c248', 'user_name': None, 'project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'project_name': None, 'resource_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664-sda', 'timestamp': '2025-12-11T06:21:30.224063', 'resource_metadata': {'display_name': 'tempest-TestGettingAddress-server-245562475', 'name': 'instance-00000032', 'instance_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'instance_type': 'm1.nano', 'host': 'd98a1be539338ce6f2cc5e72d9a2a25d2af213d7070fa0682e850ee8', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'a1ba1382-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4364.873671217, 'message_signature': 'a12b7418f2adb0df712e434ca7979b6184c5d2770dc10265b8245bd95b7fbeb6'}]}, 'timestamp': '2025-12-11 06:21:30.224655', '_unique_id': '0f03dd9ddf264c61a998a4ce637c21af'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:21:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:21:30.225 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:21:31 np0005554845 podman[225299]: 2025-12-11 06:21:31.161459802 +0000 UTC m=+0.077364502 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:21:31 np0005554845 nova_compute[187128]: 2025-12-11 06:21:31.171 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:34 np0005554845 ovn_controller[95428]: 2025-12-11T06:21:34Z|00044|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:58:2e:2d 10.100.0.13
Dec 11 01:21:34 np0005554845 ovn_controller[95428]: 2025-12-11T06:21:34Z|00045|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:58:2e:2d 10.100.0.13
Dec 11 01:21:34 np0005554845 nova_compute[187128]: 2025-12-11 06:21:34.252 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:34 np0005554845 nova_compute[187128]: 2025-12-11 06:21:34.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:21:36 np0005554845 nova_compute[187128]: 2025-12-11 06:21:36.203 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:37 np0005554845 podman[225339]: 2025-12-11 06:21:37.178093227 +0000 UTC m=+0.097229841 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:21:37 np0005554845 nova_compute[187128]: 2025-12-11 06:21:37.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:21:37 np0005554845 nova_compute[187128]: 2025-12-11 06:21:37.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:21:37 np0005554845 nova_compute[187128]: 2025-12-11 06:21:37.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:21:37 np0005554845 nova_compute[187128]: 2025-12-11 06:21:37.946 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:37 np0005554845 nova_compute[187128]: 2025-12-11 06:21:37.947 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:37 np0005554845 nova_compute[187128]: 2025-12-11 06:21:37.947 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:37 np0005554845 nova_compute[187128]: 2025-12-11 06:21:37.948 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.044 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.143 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk --force-share --output=json" returned: 0 in 0.099s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.145 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.243 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664/disk --force-share --output=json" returned: 0 in 0.098s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.423 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.424 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5521MB free_disk=73.26295852661133GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.424 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.425 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.494 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance 353ce6a3-8cb3-4e0c-962f-ee42e8483664 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.494 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.494 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.533 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.545 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.567 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:21:38 np0005554845 nova_compute[187128]: 2025-12-11 06:21:38.568 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.143s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:39 np0005554845 podman[225367]: 2025-12-11 06:21:39.127197602 +0000 UTC m=+0.051117568 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.license=GPLv2)
Dec 11 01:21:39 np0005554845 podman[225368]: 2025-12-11 06:21:39.183251334 +0000 UTC m=+0.101970349 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0)
Dec 11 01:21:39 np0005554845 nova_compute[187128]: 2025-12-11 06:21:39.253 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:39 np0005554845 nova_compute[187128]: 2025-12-11 06:21:39.569 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:21:39 np0005554845 nova_compute[187128]: 2025-12-11 06:21:39.569 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:21:39 np0005554845 nova_compute[187128]: 2025-12-11 06:21:39.569 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:21:39 np0005554845 nova_compute[187128]: 2025-12-11 06:21:39.853 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:21:39 np0005554845 nova_compute[187128]: 2025-12-11 06:21:39.853 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:21:39 np0005554845 nova_compute[187128]: 2025-12-11 06:21:39.853 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 01:21:39 np0005554845 nova_compute[187128]: 2025-12-11 06:21:39.854 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid 353ce6a3-8cb3-4e0c-962f-ee42e8483664 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:21:41 np0005554845 nova_compute[187128]: 2025-12-11 06:21:41.206 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:41 np0005554845 nova_compute[187128]: 2025-12-11 06:21:41.592 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Updating instance_info_cache with network_info: [{"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:21:41 np0005554845 nova_compute[187128]: 2025-12-11 06:21:41.612 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:21:41 np0005554845 nova_compute[187128]: 2025-12-11 06:21:41.612 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:21:41 np0005554845 nova_compute[187128]: 2025-12-11 06:21:41.612 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:21:41 np0005554845 nova_compute[187128]: 2025-12-11 06:21:41.612 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:21:41 np0005554845 nova_compute[187128]: 2025-12-11 06:21:41.613 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:21:41 np0005554845 nova_compute[187128]: 2025-12-11 06:21:41.613 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:21:42 np0005554845 nova_compute[187128]: 2025-12-11 06:21:42.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.112 187132 DEBUG nova.compute.manager [req-9b814d02-bf68-4ca3-81a5-bf9ccb029611 req-f3450cac-f5c9-4012-92ab-915b8e013d86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Received event network-changed-e1cbb3cc-b72b-471e-9234-3eed622db912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.113 187132 DEBUG nova.compute.manager [req-9b814d02-bf68-4ca3-81a5-bf9ccb029611 req-f3450cac-f5c9-4012-92ab-915b8e013d86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Refreshing instance network info cache due to event network-changed-e1cbb3cc-b72b-471e-9234-3eed622db912. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.113 187132 DEBUG oslo_concurrency.lockutils [req-9b814d02-bf68-4ca3-81a5-bf9ccb029611 req-f3450cac-f5c9-4012-92ab-915b8e013d86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.113 187132 DEBUG oslo_concurrency.lockutils [req-9b814d02-bf68-4ca3-81a5-bf9ccb029611 req-f3450cac-f5c9-4012-92ab-915b8e013d86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.114 187132 DEBUG nova.network.neutron [req-9b814d02-bf68-4ca3-81a5-bf9ccb029611 req-f3450cac-f5c9-4012-92ab-915b8e013d86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Refreshing network info cache for port e1cbb3cc-b72b-471e-9234-3eed622db912 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:21:43 np0005554845 podman[225409]: 2025-12-11 06:21:43.145448193 +0000 UTC m=+0.068600544 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.181 187132 DEBUG oslo_concurrency.lockutils [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.182 187132 DEBUG oslo_concurrency.lockutils [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.182 187132 DEBUG oslo_concurrency.lockutils [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.182 187132 DEBUG oslo_concurrency.lockutils [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.182 187132 DEBUG oslo_concurrency.lockutils [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.184 187132 INFO nova.compute.manager [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Terminating instance#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.184 187132 DEBUG nova.compute.manager [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:21:43 np0005554845 kernel: tape1cbb3cc-b7 (unregistering): left promiscuous mode
Dec 11 01:21:43 np0005554845 NetworkManager[55529]: <info>  [1765434103.2119] device (tape1cbb3cc-b7): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:21:43 np0005554845 ovn_controller[95428]: 2025-12-11T06:21:43Z|00315|binding|INFO|Releasing lport e1cbb3cc-b72b-471e-9234-3eed622db912 from this chassis (sb_readonly=0)
Dec 11 01:21:43 np0005554845 ovn_controller[95428]: 2025-12-11T06:21:43Z|00316|binding|INFO|Setting lport e1cbb3cc-b72b-471e-9234-3eed622db912 down in Southbound
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.223 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:43 np0005554845 ovn_controller[95428]: 2025-12-11T06:21:43Z|00317|binding|INFO|Removing iface tape1cbb3cc-b7 ovn-installed in OVS
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.229 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:43.235 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:58:2e:2d 10.100.0.13 2001:db8::f816:3eff:fe58:2e2d'], port_security=['fa:16:3e:58:2e:2d 10.100.0.13 2001:db8::f816:3eff:fe58:2e2d'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.13/28 2001:db8::f816:3eff:fe58:2e2d/64', 'neutron:device_id': '353ce6a3-8cb3-4e0c-962f-ee42e8483664', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '79a211a6fc3c4f68b6c3d0ba433964d3', 'neutron:revision_number': '4', 'neutron:security_group_ids': '2fc6dc65-0cfe-4a46-8eed-41ba0eb7cf58', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=ebfe8210-cfcf-46a1-b43f-6aa124b16ea7, chassis=[], tunnel_key=5, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=e1cbb3cc-b72b-471e-9234-3eed622db912) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:21:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:43.236 104320 INFO neutron.agent.ovn.metadata.agent [-] Port e1cbb3cc-b72b-471e-9234-3eed622db912 in datapath 0f33fc2f-0e53-4d9a-bdd8-75ef4128a569 unbound from our chassis#033[00m
Dec 11 01:21:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:43.237 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 0f33fc2f-0e53-4d9a-bdd8-75ef4128a569, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:21:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:43.239 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[6248560b-5b66-4307-89df-8177444c0e71]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:43.239 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569 namespace which is not needed anymore#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.249 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:43 np0005554845 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000032.scope: Deactivated successfully.
Dec 11 01:21:43 np0005554845 systemd[1]: machine-qemu\x2d24\x2dinstance\x2d00000032.scope: Consumed 13.530s CPU time.
Dec 11 01:21:43 np0005554845 systemd-machined[153381]: Machine qemu-24-instance-00000032 terminated.
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.405 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.411 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.445 187132 INFO nova.virt.libvirt.driver [-] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Instance destroyed successfully.#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.445 187132 DEBUG nova.objects.instance [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lazy-loading 'resources' on Instance uuid 353ce6a3-8cb3-4e0c-962f-ee42e8483664 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.458 187132 DEBUG nova.virt.libvirt.vif [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:21:09Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-TestGettingAddress-server-245562475',display_name='tempest-TestGettingAddress-server-245562475',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-testgettingaddress-server-245562475',id=50,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBAHwLwmbXzLhtx3QXDuN5RkoUozSBRT+uWfsT30PdOf4/6PFp0T6c3r6yQuJi9Su0GF8tbhpa3DTVgljsT6UKFBBTDUvunUKiKj2BIGYlmQqceEatpHrQFERUHlKKGvwOg==',key_name='tempest-TestGettingAddress-1710968490',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:21:21Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='79a211a6fc3c4f68b6c3d0ba433964d3',ramdisk_id='',reservation_id='r-e50j7w4d',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestGettingAddress-725523770',owner_user_name='tempest-TestGettingAddress-725523770-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:21:21Z,user_data=None,user_id='60e9372de4754580913a836e11b9c248',uuid=353ce6a3-8cb3-4e0c-962f-ee42e8483664,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 
4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.459 187132 DEBUG nova.network.os_vif_util [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converting VIF {"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.234", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.459 187132 DEBUG nova.network.os_vif_util [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:58:2e:2d,bridge_name='br-int',has_traffic_filtering=True,id=e1cbb3cc-b72b-471e-9234-3eed622db912,network=Network(0f33fc2f-0e53-4d9a-bdd8-75ef4128a569),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1cbb3cc-b7') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.460 187132 DEBUG os_vif [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:2e:2d,bridge_name='br-int',has_traffic_filtering=True,id=e1cbb3cc-b72b-471e-9234-3eed622db912,network=Network(0f33fc2f-0e53-4d9a-bdd8-75ef4128a569),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1cbb3cc-b7') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.461 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.461 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tape1cbb3cc-b7, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.463 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.464 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.466 187132 INFO os_vif [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:58:2e:2d,bridge_name='br-int',has_traffic_filtering=True,id=e1cbb3cc-b72b-471e-9234-3eed622db912,network=Network(0f33fc2f-0e53-4d9a-bdd8-75ef4128a569),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tape1cbb3cc-b7')#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.467 187132 INFO nova.virt.libvirt.driver [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Deleting instance files /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664_del#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.467 187132 INFO nova.virt.libvirt.driver [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Deletion of /var/lib/nova/instances/353ce6a3-8cb3-4e0c-962f-ee42e8483664_del complete#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.517 187132 INFO nova.compute.manager [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Took 0.33 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.518 187132 DEBUG oslo.service.loopingcall [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.518 187132 DEBUG nova.compute.manager [-] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.518 187132 DEBUG nova.network.neutron [-] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:21:43 np0005554845 neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569[225283]: [NOTICE]   (225287) : haproxy version is 2.8.14-c23fe91
Dec 11 01:21:43 np0005554845 neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569[225283]: [NOTICE]   (225287) : path to executable is /usr/sbin/haproxy
Dec 11 01:21:43 np0005554845 neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569[225283]: [WARNING]  (225287) : Exiting Master process...
Dec 11 01:21:43 np0005554845 neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569[225283]: [WARNING]  (225287) : Exiting Master process...
Dec 11 01:21:43 np0005554845 neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569[225283]: [ALERT]    (225287) : Current worker (225289) exited with code 143 (Terminated)
Dec 11 01:21:43 np0005554845 neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569[225283]: [WARNING]  (225287) : All workers exited. Exiting... (0)
Dec 11 01:21:43 np0005554845 systemd[1]: libpod-7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249.scope: Deactivated successfully.
Dec 11 01:21:43 np0005554845 podman[225454]: 2025-12-11 06:21:43.576267042 +0000 UTC m=+0.250743150 container died 7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569, org.label-schema.build-date=20251202, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 01:21:43 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249-userdata-shm.mount: Deactivated successfully.
Dec 11 01:21:43 np0005554845 systemd[1]: var-lib-containers-storage-overlay-556507203e68ef2740e636b1160b303b2ee950254660ab21e34829c87c7689ef-merged.mount: Deactivated successfully.
Dec 11 01:21:43 np0005554845 podman[225454]: 2025-12-11 06:21:43.899535969 +0000 UTC m=+0.574012087 container cleanup 7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:21:43 np0005554845 systemd[1]: libpod-conmon-7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249.scope: Deactivated successfully.
Dec 11 01:21:43 np0005554845 podman[225499]: 2025-12-11 06:21:43.975547103 +0000 UTC m=+0.049827994 container remove 7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:21:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:43.980 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4b01404f-34a2-4859-b491-e5cdd043748e]: (4, ('Thu Dec 11 06:21:43 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569 (7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249)\n7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249\nThu Dec 11 06:21:43 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569 (7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249)\n7b8b2ebf8b198e1e2c2bd9c848d13268ee176da7ca880d19102b18055616d249\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:43.983 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5da534f1-1cf9-4b49-b892-b5b8b38e07e8]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:43.984 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap0f33fc2f-00, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.986 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:43 np0005554845 kernel: tap0f33fc2f-00: left promiscuous mode
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.997 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:43 np0005554845 nova_compute[187128]: 2025-12-11 06:21:43.998 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:44.000 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[06dfca06-38b6-4379-a4b3-46e9f8270c70]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:44.025 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[7ad6c8da-6894-4ff3-8e8a-a372d4bbd459]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:44.027 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b133a703-e896-4d89-b044-dfbeda61c7cb]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:44.041 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[cab264ef-ce4b-4608-b652-4e8b01893bb6]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 435473, 'reachable_time': 19501, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225515, 'error': None, 'target': 'ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:44.044 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-0f33fc2f-0e53-4d9a-bdd8-75ef4128a569 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:21:44 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:44.044 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[cd87db74-9853-4104-86d5-3854e972eaee]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:21:44 np0005554845 systemd[1]: run-netns-ovnmeta\x2d0f33fc2f\x2d0e53\x2d4d9a\x2dbdd8\x2d75ef4128a569.mount: Deactivated successfully.
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.369 187132 DEBUG nova.network.neutron [-] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.412 187132 INFO nova.compute.manager [-] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Took 0.89 seconds to deallocate network for instance.#033[00m
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.464 187132 DEBUG oslo_concurrency.lockutils [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.464 187132 DEBUG oslo_concurrency.lockutils [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.520 187132 DEBUG nova.compute.provider_tree [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.549 187132 DEBUG nova.scheduler.client.report [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.578 187132 DEBUG oslo_concurrency.lockutils [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.114s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.618 187132 INFO nova.scheduler.client.report [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Deleted allocations for instance 353ce6a3-8cb3-4e0c-962f-ee42e8483664#033[00m
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.704 187132 DEBUG oslo_concurrency.lockutils [None req-638863ae-0ce2-4720-9066-701f00e8b2bb 60e9372de4754580913a836e11b9c248 79a211a6fc3c4f68b6c3d0ba433964d3 - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.523s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.745 187132 DEBUG nova.network.neutron [req-9b814d02-bf68-4ca3-81a5-bf9ccb029611 req-f3450cac-f5c9-4012-92ab-915b8e013d86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Updated VIF entry in instance network info cache for port e1cbb3cc-b72b-471e-9234-3eed622db912. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.746 187132 DEBUG nova.network.neutron [req-9b814d02-bf68-4ca3-81a5-bf9ccb029611 req-f3450cac-f5c9-4012-92ab-915b8e013d86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Updating instance_info_cache with network_info: [{"id": "e1cbb3cc-b72b-471e-9234-3eed622db912", "address": "fa:16:3e:58:2e:2d", "network": {"id": "0f33fc2f-0e53-4d9a-bdd8-75ef4128a569", "bridge": "br-int", "label": "tempest-network-smoke--1767723275", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.13", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}, {"cidr": "2001:db8::/64", "dns": [], "gateway": {"address": "2001:db8::", "type": "gateway", "version": 6, "meta": {}}, "ips": [{"address": "2001:db8::f816:3eff:fe58:2e2d", "type": "fixed", "version": 6, "meta": {}, "floating_ips": []}], "routes": [], "version": 6, "meta": {"enable_dhcp": true, "ipv6_address_mode": "slaac"}}], "meta": {"injected": false, "tenant_id": "79a211a6fc3c4f68b6c3d0ba433964d3", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tape1cbb3cc-b7", "ovs_interfaceid": "e1cbb3cc-b72b-471e-9234-3eed622db912", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:21:44 np0005554845 nova_compute[187128]: 2025-12-11 06:21:44.772 187132 DEBUG oslo_concurrency.lockutils [req-9b814d02-bf68-4ca3-81a5-bf9ccb029611 req-f3450cac-f5c9-4012-92ab-915b8e013d86 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-353ce6a3-8cb3-4e0c-962f-ee42e8483664" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.214 187132 DEBUG nova.compute.manager [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Received event network-vif-unplugged-e1cbb3cc-b72b-471e-9234-3eed622db912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.215 187132 DEBUG oslo_concurrency.lockutils [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.216 187132 DEBUG oslo_concurrency.lockutils [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.216 187132 DEBUG oslo_concurrency.lockutils [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.217 187132 DEBUG nova.compute.manager [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] No waiting events found dispatching network-vif-unplugged-e1cbb3cc-b72b-471e-9234-3eed622db912 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.217 187132 WARNING nova.compute.manager [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Received unexpected event network-vif-unplugged-e1cbb3cc-b72b-471e-9234-3eed622db912 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.218 187132 DEBUG nova.compute.manager [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Received event network-vif-plugged-e1cbb3cc-b72b-471e-9234-3eed622db912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.218 187132 DEBUG oslo_concurrency.lockutils [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.219 187132 DEBUG oslo_concurrency.lockutils [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.219 187132 DEBUG oslo_concurrency.lockutils [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "353ce6a3-8cb3-4e0c-962f-ee42e8483664-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.220 187132 DEBUG nova.compute.manager [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] No waiting events found dispatching network-vif-plugged-e1cbb3cc-b72b-471e-9234-3eed622db912 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.220 187132 WARNING nova.compute.manager [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Received unexpected event network-vif-plugged-e1cbb3cc-b72b-471e-9234-3eed622db912 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:21:45 np0005554845 nova_compute[187128]: 2025-12-11 06:21:45.220 187132 DEBUG nova.compute.manager [req-d314a189-1b44-48d9-b1cc-7515c347886f req-e846ff6e-ebd1-4ae3-9887-10046b63fa0c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Received event network-vif-deleted-e1cbb3cc-b72b-471e-9234-3eed622db912 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:21:46 np0005554845 nova_compute[187128]: 2025-12-11 06:21:46.208 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:48 np0005554845 podman[225516]: 2025-12-11 06:21:48.12765215 +0000 UTC m=+0.059596190 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:21:48 np0005554845 podman[225517]: 2025-12-11 06:21:48.165076615 +0000 UTC m=+0.092199194 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, release=1755695350, version=9.6, com.redhat.component=ubi9-minimal-container, maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, vendor=Red Hat, Inc., managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, distribution-scope=public)
Dec 11 01:21:48 np0005554845 nova_compute[187128]: 2025-12-11 06:21:48.494 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:51 np0005554845 nova_compute[187128]: 2025-12-11 06:21:51.209 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.496 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.668 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "0cbb99be-6747-44eb-887b-7b96fd8f5780" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.668 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.689 187132 DEBUG nova.compute.manager [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.766 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.767 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.776 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.777 187132 INFO nova.compute.claims [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.884 187132 DEBUG nova.compute.provider_tree [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.906 187132 DEBUG nova.scheduler.client.report [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.938 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.171s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.939 187132 DEBUG nova.compute.manager [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.990 187132 DEBUG nova.compute.manager [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:21:53 np0005554845 nova_compute[187128]: 2025-12-11 06:21:53.990 187132 DEBUG nova.network.neutron [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.008 187132 INFO nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.026 187132 DEBUG nova.compute.manager [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.128 187132 DEBUG nova.compute.manager [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.130 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.130 187132 INFO nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Creating image(s)#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.131 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "/var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.131 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "/var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.132 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "/var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.145 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.210 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.065s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.211 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.212 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.224 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.292 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.068s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.294 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.331 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk 1073741824" returned: 0 in 0.038s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.333 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.121s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.333 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.396 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.062s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.397 187132 DEBUG nova.virt.disk.api [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Checking if we can resize image /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.398 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.457 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.458 187132 DEBUG nova.virt.disk.api [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Cannot resize image /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.459 187132 DEBUG nova.objects.instance [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lazy-loading 'migration_context' on Instance uuid 0cbb99be-6747-44eb-887b-7b96fd8f5780 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.520 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.521 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Ensure instance console log exists: /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.521 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.522 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.522 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:54 np0005554845 nova_compute[187128]: 2025-12-11 06:21:54.945 187132 DEBUG nova.policy [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:21:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:55.143 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=20, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=19) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:21:55 np0005554845 nova_compute[187128]: 2025-12-11 06:21:55.144 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:55 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:55.144 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:21:55 np0005554845 nova_compute[187128]: 2025-12-11 06:21:55.693 187132 DEBUG nova.network.neutron [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Successfully created port: 26eb69c4-0a27-4b9c-be6e-942db09809ed _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:21:56 np0005554845 nova_compute[187128]: 2025-12-11 06:21:56.210 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:57 np0005554845 nova_compute[187128]: 2025-12-11 06:21:57.072 187132 DEBUG nova.network.neutron [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Successfully updated port: 26eb69c4-0a27-4b9c-be6e-942db09809ed _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:21:57 np0005554845 nova_compute[187128]: 2025-12-11 06:21:57.169 187132 DEBUG nova.compute.manager [req-52660d06-31be-4c25-b208-6d8fbb3de9b6 req-f08db99a-d33a-4159-85e9-0634d93552cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Received event network-changed-26eb69c4-0a27-4b9c-be6e-942db09809ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:21:57 np0005554845 nova_compute[187128]: 2025-12-11 06:21:57.169 187132 DEBUG nova.compute.manager [req-52660d06-31be-4c25-b208-6d8fbb3de9b6 req-f08db99a-d33a-4159-85e9-0634d93552cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Refreshing instance network info cache due to event network-changed-26eb69c4-0a27-4b9c-be6e-942db09809ed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:21:57 np0005554845 nova_compute[187128]: 2025-12-11 06:21:57.170 187132 DEBUG oslo_concurrency.lockutils [req-52660d06-31be-4c25-b208-6d8fbb3de9b6 req-f08db99a-d33a-4159-85e9-0634d93552cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:21:57 np0005554845 nova_compute[187128]: 2025-12-11 06:21:57.170 187132 DEBUG oslo_concurrency.lockutils [req-52660d06-31be-4c25-b208-6d8fbb3de9b6 req-f08db99a-d33a-4159-85e9-0634d93552cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:21:57 np0005554845 nova_compute[187128]: 2025-12-11 06:21:57.170 187132 DEBUG nova.network.neutron [req-52660d06-31be-4c25-b208-6d8fbb3de9b6 req-f08db99a-d33a-4159-85e9-0634d93552cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Refreshing network info cache for port 26eb69c4-0a27-4b9c-be6e-942db09809ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:21:57 np0005554845 nova_compute[187128]: 2025-12-11 06:21:57.173 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:21:57 np0005554845 nova_compute[187128]: 2025-12-11 06:21:57.866 187132 DEBUG nova.network.neutron [req-52660d06-31be-4c25-b208-6d8fbb3de9b6 req-f08db99a-d33a-4159-85e9-0634d93552cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:21:58 np0005554845 nova_compute[187128]: 2025-12-11 06:21:58.072 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:21:58.146 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '20'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:58 np0005554845 nova_compute[187128]: 2025-12-11 06:21:58.182 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:58 np0005554845 nova_compute[187128]: 2025-12-11 06:21:58.207 187132 DEBUG nova.network.neutron [req-52660d06-31be-4c25-b208-6d8fbb3de9b6 req-f08db99a-d33a-4159-85e9-0634d93552cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:21:58 np0005554845 nova_compute[187128]: 2025-12-11 06:21:58.226 187132 DEBUG oslo_concurrency.lockutils [req-52660d06-31be-4c25-b208-6d8fbb3de9b6 req-f08db99a-d33a-4159-85e9-0634d93552cd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:21:58 np0005554845 nova_compute[187128]: 2025-12-11 06:21:58.226 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquired lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:21:58 np0005554845 nova_compute[187128]: 2025-12-11 06:21:58.227 187132 DEBUG nova.network.neutron [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:21:58 np0005554845 nova_compute[187128]: 2025-12-11 06:21:58.444 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765434103.4429502, 353ce6a3-8cb3-4e0c-962f-ee42e8483664 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:21:58 np0005554845 nova_compute[187128]: 2025-12-11 06:21:58.445 187132 INFO nova.compute.manager [-] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:21:58 np0005554845 nova_compute[187128]: 2025-12-11 06:21:58.462 187132 DEBUG nova.compute.manager [None req-b83ff630-1222-453a-b394-eb6aab440041 - - - - - -] [instance: 353ce6a3-8cb3-4e0c-962f-ee42e8483664] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:21:58 np0005554845 nova_compute[187128]: 2025-12-11 06:21:58.498 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:58 np0005554845 nova_compute[187128]: 2025-12-11 06:21:58.863 187132 DEBUG nova.network.neutron [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.734 187132 DEBUG nova.network.neutron [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Updating instance_info_cache with network_info: [{"id": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "address": "fa:16:3e:79:33:3b", "network": {"id": "c4ce3890-fdca-4f63-911e-50acc18a43b6", "bridge": "br-int", "label": "tempest-network-smoke--1939158281", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26eb69c4-0a", "ovs_interfaceid": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.757 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Releasing lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.758 187132 DEBUG nova.compute.manager [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Instance network_info: |[{"id": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "address": "fa:16:3e:79:33:3b", "network": {"id": "c4ce3890-fdca-4f63-911e-50acc18a43b6", "bridge": "br-int", "label": "tempest-network-smoke--1939158281", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26eb69c4-0a", "ovs_interfaceid": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.760 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Start _get_guest_xml network_info=[{"id": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "address": "fa:16:3e:79:33:3b", "network": {"id": "c4ce3890-fdca-4f63-911e-50acc18a43b6", "bridge": "br-int", "label": "tempest-network-smoke--1939158281", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26eb69c4-0a", "ovs_interfaceid": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.766 187132 WARNING nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.770 187132 DEBUG nova.virt.libvirt.host [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.771 187132 DEBUG nova.virt.libvirt.host [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.773 187132 DEBUG nova.virt.libvirt.host [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.774 187132 DEBUG nova.virt.libvirt.host [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.775 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.775 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.775 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.776 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.776 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.776 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.776 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.776 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.777 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.777 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.777 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.777 187132 DEBUG nova.virt.hardware [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.780 187132 DEBUG nova.virt.libvirt.vif [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:21:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-1-1337522626',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-1-1337522626',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2036320412-ge',id=52,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUz5IvvjDwpHmHFD+orw8ijcCg0K2OjkrWrFrg2yfYVF0H/4L3w+fOGJrdF/TD0MBWI2TYTY3cn3BSGdGEqLD630J7pJz50QUHTJr6VmH2nVS35zTlVO9F1/aMKjOh5tQ==',key_name='tempest-TestSecurityGroupsBasicOps-881694630',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d8630abd3cd4aef89d0b1af6e62ac93',ramdisk_id='',reservation_id='r-ktcgbhu4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2036320412',owner_user_name='tempest-TestSecurityGroupsBasicOps-2036320412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:21:54Z,user_data=None,user_id='78548cbaea0e406ebb716882c382c954',uuid=0cbb99be-6747-44eb-887b-7b96fd8f5780,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "address": "fa:16:3e:79:33:3b", "network": {"id": "c4ce3890-fdca-4f63-911e-50acc18a43b6", "bridge": "br-int", "label": "tempest-network-smoke--1939158281", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26eb69c4-0a", "ovs_interfaceid": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.780 187132 DEBUG nova.network.os_vif_util [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converting VIF {"id": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "address": "fa:16:3e:79:33:3b", "network": {"id": "c4ce3890-fdca-4f63-911e-50acc18a43b6", "bridge": "br-int", "label": "tempest-network-smoke--1939158281", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26eb69c4-0a", "ovs_interfaceid": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.781 187132 DEBUG nova.network.os_vif_util [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:33:3b,bridge_name='br-int',has_traffic_filtering=True,id=26eb69c4-0a27-4b9c-be6e-942db09809ed,network=Network(c4ce3890-fdca-4f63-911e-50acc18a43b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26eb69c4-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.782 187132 DEBUG nova.objects.instance [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lazy-loading 'pci_devices' on Instance uuid 0cbb99be-6747-44eb-887b-7b96fd8f5780 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.798 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  <uuid>0cbb99be-6747-44eb-887b-7b96fd8f5780</uuid>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  <name>instance-00000034</name>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-1-1337522626</nova:name>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:21:59</nova:creationTime>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:        <nova:user uuid="78548cbaea0e406ebb716882c382c954">tempest-TestSecurityGroupsBasicOps-2036320412-project-member</nova:user>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:        <nova:project uuid="9d8630abd3cd4aef89d0b1af6e62ac93">tempest-TestSecurityGroupsBasicOps-2036320412</nova:project>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:        <nova:port uuid="26eb69c4-0a27-4b9c-be6e-942db09809ed">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.4" ipVersion="4"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <entry name="serial">0cbb99be-6747-44eb-887b-7b96fd8f5780</entry>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <entry name="uuid">0cbb99be-6747-44eb-887b-7b96fd8f5780</entry>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk.config"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:79:33:3b"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <target dev="tap26eb69c4-0a"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/console.log" append="off"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:21:59 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:21:59 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:21:59 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:21:59 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.800 187132 DEBUG nova.compute.manager [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Preparing to wait for external event network-vif-plugged-26eb69c4-0a27-4b9c-be6e-942db09809ed prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.800 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.800 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.801 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.802 187132 DEBUG nova.virt.libvirt.vif [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:21:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-1-1337522626',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-1-1337522626',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2036320412-ge',id=52,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUz5IvvjDwpHmHFD+orw8ijcCg0K2OjkrWrFrg2yfYVF0H/4L3w+fOGJrdF/TD0MBWI2TYTY3cn3BSGdGEqLD630J7pJz50QUHTJr6VmH2nVS35zTlVO9F1/aMKjOh5tQ==',key_name='tempest-TestSecurityGroupsBasicOps-881694630',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d8630abd3cd4aef89d0b1af6e62ac93',ramdisk_id='',reservation_id='r-ktcgbhu4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2036320412',owner_user_name='tempest-TestSecurityGroupsBasicOps-2036320412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:21:54Z,user_data=None,user_id='78548cbaea0e406ebb716882c382c954',uuid=0cbb99be-6747-44eb-887b-7b96fd8f5780,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "address": "fa:16:3e:79:33:3b", "network": {"id": "c4ce3890-fdca-4f63-911e-50acc18a43b6", "bridge": "br-int", "label": "tempest-network-smoke--1939158281", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26eb69c4-0a", "ovs_interfaceid": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.802 187132 DEBUG nova.network.os_vif_util [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converting VIF {"id": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "address": "fa:16:3e:79:33:3b", "network": {"id": "c4ce3890-fdca-4f63-911e-50acc18a43b6", "bridge": "br-int", "label": "tempest-network-smoke--1939158281", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26eb69c4-0a", "ovs_interfaceid": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.803 187132 DEBUG nova.network.os_vif_util [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:79:33:3b,bridge_name='br-int',has_traffic_filtering=True,id=26eb69c4-0a27-4b9c-be6e-942db09809ed,network=Network(c4ce3890-fdca-4f63-911e-50acc18a43b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26eb69c4-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.803 187132 DEBUG os_vif [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:33:3b,bridge_name='br-int',has_traffic_filtering=True,id=26eb69c4-0a27-4b9c-be6e-942db09809ed,network=Network(c4ce3890-fdca-4f63-911e-50acc18a43b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26eb69c4-0a') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.804 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.804 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.804 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.807 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.807 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap26eb69c4-0a, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.808 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap26eb69c4-0a, col_values=(('external_ids', {'iface-id': '26eb69c4-0a27-4b9c-be6e-942db09809ed', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:79:33:3b', 'vm-uuid': '0cbb99be-6747-44eb-887b-7b96fd8f5780'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.809 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:59 np0005554845 NetworkManager[55529]: <info>  [1765434119.8106] manager: (tap26eb69c4-0a): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/163)
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.813 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.815 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.816 187132 INFO os_vif [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:79:33:3b,bridge_name='br-int',has_traffic_filtering=True,id=26eb69c4-0a27-4b9c-be6e-942db09809ed,network=Network(c4ce3890-fdca-4f63-911e-50acc18a43b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26eb69c4-0a')#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.886 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.887 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.887 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] No VIF found with MAC fa:16:3e:79:33:3b, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:21:59 np0005554845 nova_compute[187128]: 2025-12-11 06:21:59.887 187132 INFO nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Using config drive#033[00m
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.260 187132 INFO nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Creating config drive at /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk.config#033[00m
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.267 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwkbd22xi execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.393 187132 DEBUG oslo_concurrency.processutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmpwkbd22xi" returned: 0 in 0.126s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:22:00 np0005554845 kernel: tap26eb69c4-0a: entered promiscuous mode
Dec 11 01:22:00 np0005554845 NetworkManager[55529]: <info>  [1765434120.4447] manager: (tap26eb69c4-0a): new Tun device (/org/freedesktop/NetworkManager/Devices/164)
Dec 11 01:22:00 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:00Z|00318|binding|INFO|Claiming lport 26eb69c4-0a27-4b9c-be6e-942db09809ed for this chassis.
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.448 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:00 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:00Z|00319|binding|INFO|26eb69c4-0a27-4b9c-be6e-942db09809ed: Claiming fa:16:3e:79:33:3b 10.100.0.4
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.452 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.456 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:00 np0005554845 NetworkManager[55529]: <info>  [1765434120.4600] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/165)
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.459 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:00 np0005554845 NetworkManager[55529]: <info>  [1765434120.4608] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/166)
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.468 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:33:3b 10.100.0.4'], port_security=['fa:16:3e:79:33:3b 10.100.0.4'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0cbb99be-6747-44eb-887b-7b96fd8f5780', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4ce3890-fdca-4f63-911e-50acc18a43b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'neutron:revision_number': '2', 'neutron:security_group_ids': '86ce5d93-12b5-417d-bee5-534e2c8ea6cb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8769db44-bfba-4d2f-9b41-01e056aa80f1, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=26eb69c4-0a27-4b9c-be6e-942db09809ed) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.469 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 26eb69c4-0a27-4b9c-be6e-942db09809ed in datapath c4ce3890-fdca-4f63-911e-50acc18a43b6 bound to our chassis#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.470 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network c4ce3890-fdca-4f63-911e-50acc18a43b6#033[00m
Dec 11 01:22:00 np0005554845 systemd-udevd[225595]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.481 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ba363cdf-4803-4df9-afd5-aa487fe1d782]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.482 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tapc4ce3890-f1 in ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:22:00 np0005554845 NetworkManager[55529]: <info>  [1765434120.4852] device (tap26eb69c4-0a): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:22:00 np0005554845 NetworkManager[55529]: <info>  [1765434120.4862] device (tap26eb69c4-0a): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.487 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tapc4ce3890-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.487 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b4b80472-2ef1-4ee8-8864-53fb7c8d6e88]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.488 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[fc879003-8b8a-47a4-a725-8895f874544a]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 systemd-machined[153381]: New machine qemu-25-instance-00000034.
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.501 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[57740480-6192-4a80-9d17-37a914b9ae3c]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.525 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a20bf074-3fa1-4a91-9b73-70de532fa5c2]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.526 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:00 np0005554845 systemd[1]: Started Virtual Machine qemu-25-instance-00000034.
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.533 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:00 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:00Z|00320|binding|INFO|Setting lport 26eb69c4-0a27-4b9c-be6e-942db09809ed ovn-installed in OVS
Dec 11 01:22:00 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:00Z|00321|binding|INFO|Setting lport 26eb69c4-0a27-4b9c-be6e-942db09809ed up in Southbound
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.543 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.555 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[c6cce195-2c85-4635-8316-9df44d479fef]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 NetworkManager[55529]: <info>  [1765434120.5604] manager: (tapc4ce3890-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/167)
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.559 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[4b4e45f7-81e1-44ec-9bd3-71e266a87c9e]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.587 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[106b34c2-3b7f-4e7e-a652-5f084979b119]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.590 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[03b10c7f-8d8d-4bcd-9808-29a06b4def51]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 NetworkManager[55529]: <info>  [1765434120.6146] device (tapc4ce3890-f0): carrier: link connected
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.619 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[97bea41c-42ce-4225-9172-fd0919bf5bc1]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.634 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[47bbac04-efca-4399-9642-3cdb0a4dc25b]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc4ce3890-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ac:b6:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439526, 'reachable_time': 25052, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225629, 'error': None, 'target': 'ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.646 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a1cced62-a54f-4531-8697-7215f5db9139]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:feac:b647'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 439526, 'tstamp': 439526}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 225630, 'error': None, 'target': 'ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.659 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a60c7ee4-a69c-4e19-93c8-2523af2c115f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tapc4ce3890-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:ac:b6:47'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 97], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439526, 'reachable_time': 25052, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 225631, 'error': None, 'target': 'ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.687 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a81b7af6-9244-4753-a223-f3fb21d707ae]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.743 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[034cb442-6809-49a7-9fac-bd141574e9ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.744 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4ce3890-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.744 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.744 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tapc4ce3890-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.746 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:00 np0005554845 NetworkManager[55529]: <info>  [1765434120.7468] manager: (tapc4ce3890-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/168)
Dec 11 01:22:00 np0005554845 kernel: tapc4ce3890-f0: entered promiscuous mode
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.749 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.750 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tapc4ce3890-f0, col_values=(('external_ids', {'iface-id': 'cdc50315-02fd-4829-8636-095267b6165e'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.752 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.752 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:00 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:00Z|00322|binding|INFO|Releasing lport cdc50315-02fd-4829-8636-095267b6165e from this chassis (sb_readonly=0)
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.753 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/c4ce3890-fdca-4f63-911e-50acc18a43b6.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/c4ce3890-fdca-4f63-911e-50acc18a43b6.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.754 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ee8f1a5b-2631-4c03-87be-aeec081eb659]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.755 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-c4ce3890-fdca-4f63-911e-50acc18a43b6
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/c4ce3890-fdca-4f63-911e-50acc18a43b6.pid.haproxy
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID c4ce3890-fdca-4f63-911e-50acc18a43b6
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:22:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:00.755 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6', 'env', 'PROCESS_TAG=haproxy-c4ce3890-fdca-4f63-911e-50acc18a43b6', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/c4ce3890-fdca-4f63-911e-50acc18a43b6.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:22:00 np0005554845 nova_compute[187128]: 2025-12-11 06:22:00.763 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.065 187132 DEBUG nova.compute.manager [req-f0dae33d-cfdc-4eb0-b8f5-e8614e38884e req-5f92575e-f2ef-4fc6-bf4c-291df57790d8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Received event network-vif-plugged-26eb69c4-0a27-4b9c-be6e-942db09809ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.066 187132 DEBUG oslo_concurrency.lockutils [req-f0dae33d-cfdc-4eb0-b8f5-e8614e38884e req-5f92575e-f2ef-4fc6-bf4c-291df57790d8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.066 187132 DEBUG oslo_concurrency.lockutils [req-f0dae33d-cfdc-4eb0-b8f5-e8614e38884e req-5f92575e-f2ef-4fc6-bf4c-291df57790d8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.067 187132 DEBUG oslo_concurrency.lockutils [req-f0dae33d-cfdc-4eb0-b8f5-e8614e38884e req-5f92575e-f2ef-4fc6-bf4c-291df57790d8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.067 187132 DEBUG nova.compute.manager [req-f0dae33d-cfdc-4eb0-b8f5-e8614e38884e req-5f92575e-f2ef-4fc6-bf4c-291df57790d8 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Processing event network-vif-plugged-26eb69c4-0a27-4b9c-be6e-942db09809ed _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:22:01 np0005554845 podman[225664]: 2025-12-11 06:22:01.114975395 +0000 UTC m=+0.055271571 container create d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS)
Dec 11 01:22:01 np0005554845 systemd[1]: Started libpod-conmon-d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a.scope.
Dec 11 01:22:01 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:22:01 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/865f2bcc35d484952ac3b871508dfd314893472318d5c7ad26b82aea9ce2307f/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:22:01 np0005554845 podman[225664]: 2025-12-11 06:22:01.086855562 +0000 UTC m=+0.027151778 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:22:01 np0005554845 podman[225664]: 2025-12-11 06:22:01.201408263 +0000 UTC m=+0.141704459 container init d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:22:01 np0005554845 podman[225664]: 2025-12-11 06:22:01.208629739 +0000 UTC m=+0.148925935 container start d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true)
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.212 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:01 np0005554845 neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6[225679]: [NOTICE]   (225683) : New worker (225685) forked
Dec 11 01:22:01 np0005554845 neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6[225679]: [NOTICE]   (225683) : Loading success.
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.402 187132 DEBUG nova.compute.manager [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Instance event wait completed in 0 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.403 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765434121.402143, 0cbb99be-6747-44eb-887b-7b96fd8f5780 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.405 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] VM Started (Lifecycle Event)#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.409 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.413 187132 INFO nova.virt.libvirt.driver [-] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Instance spawned successfully.#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.413 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.432 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.437 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Synchronizing instance power state after lifecycle event "Started"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.447 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.448 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.449 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.449 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.450 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.450 187132 DEBUG nova.virt.libvirt.driver [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.481 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.482 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765434121.4023645, 0cbb99be-6747-44eb-887b-7b96fd8f5780 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.482 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.515 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.518 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765434121.407501, 0cbb99be-6747-44eb-887b-7b96fd8f5780 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.518 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.527 187132 INFO nova.compute.manager [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Took 7.40 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.528 187132 DEBUG nova.compute.manager [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.534 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.537 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.563 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.584 187132 INFO nova.compute.manager [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Took 7.84 seconds to build instance.#033[00m
Dec 11 01:22:01 np0005554845 nova_compute[187128]: 2025-12-11 06:22:01.597 187132 DEBUG oslo_concurrency.lockutils [None req-b10b1c98-240e-405f-89fd-3763937719fe 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 7.929s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:02 np0005554845 podman[225701]: 2025-12-11 06:22:02.647025846 +0000 UTC m=+0.058179560 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:22:03 np0005554845 nova_compute[187128]: 2025-12-11 06:22:03.267 187132 DEBUG nova.compute.manager [req-b3482c36-a5c2-4d81-b9a2-4a02864f0ca8 req-00038bad-ca69-426f-8b00-b9484d2439d7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Received event network-vif-plugged-26eb69c4-0a27-4b9c-be6e-942db09809ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:22:03 np0005554845 nova_compute[187128]: 2025-12-11 06:22:03.268 187132 DEBUG oslo_concurrency.lockutils [req-b3482c36-a5c2-4d81-b9a2-4a02864f0ca8 req-00038bad-ca69-426f-8b00-b9484d2439d7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:03 np0005554845 nova_compute[187128]: 2025-12-11 06:22:03.268 187132 DEBUG oslo_concurrency.lockutils [req-b3482c36-a5c2-4d81-b9a2-4a02864f0ca8 req-00038bad-ca69-426f-8b00-b9484d2439d7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:03 np0005554845 nova_compute[187128]: 2025-12-11 06:22:03.269 187132 DEBUG oslo_concurrency.lockutils [req-b3482c36-a5c2-4d81-b9a2-4a02864f0ca8 req-00038bad-ca69-426f-8b00-b9484d2439d7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:03 np0005554845 nova_compute[187128]: 2025-12-11 06:22:03.269 187132 DEBUG nova.compute.manager [req-b3482c36-a5c2-4d81-b9a2-4a02864f0ca8 req-00038bad-ca69-426f-8b00-b9484d2439d7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] No waiting events found dispatching network-vif-plugged-26eb69c4-0a27-4b9c-be6e-942db09809ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:22:03 np0005554845 nova_compute[187128]: 2025-12-11 06:22:03.269 187132 WARNING nova.compute.manager [req-b3482c36-a5c2-4d81-b9a2-4a02864f0ca8 req-00038bad-ca69-426f-8b00-b9484d2439d7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Received unexpected event network-vif-plugged-26eb69c4-0a27-4b9c-be6e-942db09809ed for instance with vm_state active and task_state None.#033[00m
Dec 11 01:22:04 np0005554845 nova_compute[187128]: 2025-12-11 06:22:04.811 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:05 np0005554845 nova_compute[187128]: 2025-12-11 06:22:05.394 187132 DEBUG nova.compute.manager [req-1328292a-a695-49ed-9e0f-0f047961edb3 req-534bbaa0-0e3e-4fed-b569-2889312abccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Received event network-changed-26eb69c4-0a27-4b9c-be6e-942db09809ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:22:05 np0005554845 nova_compute[187128]: 2025-12-11 06:22:05.394 187132 DEBUG nova.compute.manager [req-1328292a-a695-49ed-9e0f-0f047961edb3 req-534bbaa0-0e3e-4fed-b569-2889312abccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Refreshing instance network info cache due to event network-changed-26eb69c4-0a27-4b9c-be6e-942db09809ed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:22:05 np0005554845 nova_compute[187128]: 2025-12-11 06:22:05.395 187132 DEBUG oslo_concurrency.lockutils [req-1328292a-a695-49ed-9e0f-0f047961edb3 req-534bbaa0-0e3e-4fed-b569-2889312abccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:22:05 np0005554845 nova_compute[187128]: 2025-12-11 06:22:05.395 187132 DEBUG oslo_concurrency.lockutils [req-1328292a-a695-49ed-9e0f-0f047961edb3 req-534bbaa0-0e3e-4fed-b569-2889312abccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:22:05 np0005554845 nova_compute[187128]: 2025-12-11 06:22:05.395 187132 DEBUG nova.network.neutron [req-1328292a-a695-49ed-9e0f-0f047961edb3 req-534bbaa0-0e3e-4fed-b569-2889312abccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Refreshing network info cache for port 26eb69c4-0a27-4b9c-be6e-942db09809ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:22:05 np0005554845 nova_compute[187128]: 2025-12-11 06:22:05.417 187132 DEBUG nova.compute.manager [req-a085c91a-b139-4432-8d91-a83a0ae1d174 req-544c5cc0-d5ec-4e86-bbce-3ecfdfff42ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Received event network-changed-26eb69c4-0a27-4b9c-be6e-942db09809ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:22:05 np0005554845 nova_compute[187128]: 2025-12-11 06:22:05.418 187132 DEBUG nova.compute.manager [req-a085c91a-b139-4432-8d91-a83a0ae1d174 req-544c5cc0-d5ec-4e86-bbce-3ecfdfff42ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Refreshing instance network info cache due to event network-changed-26eb69c4-0a27-4b9c-be6e-942db09809ed. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:22:05 np0005554845 nova_compute[187128]: 2025-12-11 06:22:05.418 187132 DEBUG oslo_concurrency.lockutils [req-a085c91a-b139-4432-8d91-a83a0ae1d174 req-544c5cc0-d5ec-4e86-bbce-3ecfdfff42ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:22:06 np0005554845 nova_compute[187128]: 2025-12-11 06:22:06.214 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:07 np0005554845 nova_compute[187128]: 2025-12-11 06:22:07.771 187132 DEBUG nova.network.neutron [req-1328292a-a695-49ed-9e0f-0f047961edb3 req-534bbaa0-0e3e-4fed-b569-2889312abccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Updated VIF entry in instance network info cache for port 26eb69c4-0a27-4b9c-be6e-942db09809ed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:22:07 np0005554845 nova_compute[187128]: 2025-12-11 06:22:07.772 187132 DEBUG nova.network.neutron [req-1328292a-a695-49ed-9e0f-0f047961edb3 req-534bbaa0-0e3e-4fed-b569-2889312abccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Updating instance_info_cache with network_info: [{"id": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "address": "fa:16:3e:79:33:3b", "network": {"id": "c4ce3890-fdca-4f63-911e-50acc18a43b6", "bridge": "br-int", "label": "tempest-network-smoke--1939158281", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26eb69c4-0a", "ovs_interfaceid": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:22:07 np0005554845 nova_compute[187128]: 2025-12-11 06:22:07.798 187132 DEBUG oslo_concurrency.lockutils [req-1328292a-a695-49ed-9e0f-0f047961edb3 req-534bbaa0-0e3e-4fed-b569-2889312abccd eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:22:07 np0005554845 nova_compute[187128]: 2025-12-11 06:22:07.799 187132 DEBUG oslo_concurrency.lockutils [req-a085c91a-b139-4432-8d91-a83a0ae1d174 req-544c5cc0-d5ec-4e86-bbce-3ecfdfff42ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:22:07 np0005554845 nova_compute[187128]: 2025-12-11 06:22:07.799 187132 DEBUG nova.network.neutron [req-a085c91a-b139-4432-8d91-a83a0ae1d174 req-544c5cc0-d5ec-4e86-bbce-3ecfdfff42ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Refreshing network info cache for port 26eb69c4-0a27-4b9c-be6e-942db09809ed _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:22:08 np0005554845 podman[225724]: 2025-12-11 06:22:08.171279371 +0000 UTC m=+0.102463413 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, io.buildah.version=1.41.3, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 01:22:09 np0005554845 nova_compute[187128]: 2025-12-11 06:22:09.817 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:09 np0005554845 nova_compute[187128]: 2025-12-11 06:22:09.874 187132 DEBUG nova.network.neutron [req-a085c91a-b139-4432-8d91-a83a0ae1d174 req-544c5cc0-d5ec-4e86-bbce-3ecfdfff42ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Updated VIF entry in instance network info cache for port 26eb69c4-0a27-4b9c-be6e-942db09809ed. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:22:09 np0005554845 nova_compute[187128]: 2025-12-11 06:22:09.874 187132 DEBUG nova.network.neutron [req-a085c91a-b139-4432-8d91-a83a0ae1d174 req-544c5cc0-d5ec-4e86-bbce-3ecfdfff42ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Updating instance_info_cache with network_info: [{"id": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "address": "fa:16:3e:79:33:3b", "network": {"id": "c4ce3890-fdca-4f63-911e-50acc18a43b6", "bridge": "br-int", "label": "tempest-network-smoke--1939158281", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26eb69c4-0a", "ovs_interfaceid": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:22:09 np0005554845 nova_compute[187128]: 2025-12-11 06:22:09.896 187132 DEBUG oslo_concurrency.lockutils [req-a085c91a-b139-4432-8d91-a83a0ae1d174 req-544c5cc0-d5ec-4e86-bbce-3ecfdfff42ee eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-0cbb99be-6747-44eb-887b-7b96fd8f5780" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:22:10 np0005554845 podman[225744]: 2025-12-11 06:22:10.122000421 +0000 UTC m=+0.055125798 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 01:22:10 np0005554845 podman[225745]: 2025-12-11 06:22:10.199229468 +0000 UTC m=+0.125650682 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:22:11 np0005554845 nova_compute[187128]: 2025-12-11 06:22:11.216 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:14 np0005554845 podman[225805]: 2025-12-11 06:22:14.146930214 +0000 UTC m=+0.060362670 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:22:14 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:14Z|00046|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:79:33:3b 10.100.0.4
Dec 11 01:22:14 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:14Z|00047|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:79:33:3b 10.100.0.4
Dec 11 01:22:14 np0005554845 nova_compute[187128]: 2025-12-11 06:22:14.821 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:16 np0005554845 nova_compute[187128]: 2025-12-11 06:22:16.218 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:19 np0005554845 podman[225827]: 2025-12-11 06:22:19.140588751 +0000 UTC m=+0.066524057 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., architecture=x86_64, name=ubi9-minimal, build-date=2025-08-20T13:12:41, release=1755695350, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, distribution-scope=public, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, container_name=openstack_network_exporter)
Dec 11 01:22:19 np0005554845 podman[225826]: 2025-12-11 06:22:19.159550746 +0000 UTC m=+0.082256545 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:22:19 np0005554845 nova_compute[187128]: 2025-12-11 06:22:19.825 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.434 187132 DEBUG oslo_concurrency.lockutils [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "0cbb99be-6747-44eb-887b-7b96fd8f5780" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.435 187132 DEBUG oslo_concurrency.lockutils [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.435 187132 DEBUG oslo_concurrency.lockutils [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.435 187132 DEBUG oslo_concurrency.lockutils [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.436 187132 DEBUG oslo_concurrency.lockutils [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.437 187132 INFO nova.compute.manager [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Terminating instance#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.438 187132 DEBUG nova.compute.manager [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:22:20 np0005554845 kernel: tap26eb69c4-0a (unregistering): left promiscuous mode
Dec 11 01:22:20 np0005554845 NetworkManager[55529]: <info>  [1765434140.4714] device (tap26eb69c4-0a): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:22:20 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:20Z|00323|binding|INFO|Releasing lport 26eb69c4-0a27-4b9c-be6e-942db09809ed from this chassis (sb_readonly=0)
Dec 11 01:22:20 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:20Z|00324|binding|INFO|Setting lport 26eb69c4-0a27-4b9c-be6e-942db09809ed down in Southbound
Dec 11 01:22:20 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:20Z|00325|binding|INFO|Removing iface tap26eb69c4-0a ovn-installed in OVS
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.483 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.499 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.504 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:79:33:3b 10.100.0.4', 'unknown'], port_security=[], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.4/28', 'neutron:device_id': '0cbb99be-6747-44eb-887b-7b96fd8f5780', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-c4ce3890-fdca-4f63-911e-50acc18a43b6', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'neutron:revision_number': '6', 'neutron:security_group_ids': '', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=8769db44-bfba-4d2f-9b41-01e056aa80f1, chassis=[], tunnel_key=4, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=26eb69c4-0a27-4b9c-be6e-942db09809ed) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.506 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 26eb69c4-0a27-4b9c-be6e-942db09809ed in datapath c4ce3890-fdca-4f63-911e-50acc18a43b6 unbound from our chassis#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.509 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network c4ce3890-fdca-4f63-911e-50acc18a43b6, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.511 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[87152fc7-a24e-4f8d-8b63-d19aedc368e1]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.512 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6 namespace which is not needed anymore#033[00m
Dec 11 01:22:20 np0005554845 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000034.scope: Deactivated successfully.
Dec 11 01:22:20 np0005554845 systemd[1]: machine-qemu\x2d25\x2dinstance\x2d00000034.scope: Consumed 14.452s CPU time.
Dec 11 01:22:20 np0005554845 systemd-machined[153381]: Machine qemu-25-instance-00000034 terminated.
Dec 11 01:22:20 np0005554845 neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6[225679]: [NOTICE]   (225683) : haproxy version is 2.8.14-c23fe91
Dec 11 01:22:20 np0005554845 neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6[225679]: [NOTICE]   (225683) : path to executable is /usr/sbin/haproxy
Dec 11 01:22:20 np0005554845 neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6[225679]: [WARNING]  (225683) : Exiting Master process...
Dec 11 01:22:20 np0005554845 neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6[225679]: [ALERT]    (225683) : Current worker (225685) exited with code 143 (Terminated)
Dec 11 01:22:20 np0005554845 neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6[225679]: [WARNING]  (225683) : All workers exited. Exiting... (0)
Dec 11 01:22:20 np0005554845 systemd[1]: libpod-d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a.scope: Deactivated successfully.
Dec 11 01:22:20 np0005554845 podman[225895]: 2025-12-11 06:22:20.651474267 +0000 UTC m=+0.045529347 container died d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.657 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.663 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:20 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a-userdata-shm.mount: Deactivated successfully.
Dec 11 01:22:20 np0005554845 systemd[1]: var-lib-containers-storage-overlay-865f2bcc35d484952ac3b871508dfd314893472318d5c7ad26b82aea9ce2307f-merged.mount: Deactivated successfully.
Dec 11 01:22:20 np0005554845 podman[225895]: 2025-12-11 06:22:20.69944725 +0000 UTC m=+0.093502330 container cleanup d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:22:20 np0005554845 systemd[1]: libpod-conmon-d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a.scope: Deactivated successfully.
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.712 187132 INFO nova.virt.libvirt.driver [-] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Instance destroyed successfully.#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.713 187132 DEBUG nova.objects.instance [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lazy-loading 'resources' on Instance uuid 0cbb99be-6747-44eb-887b-7b96fd8f5780 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.767 187132 DEBUG nova.virt.libvirt.vif [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:21:52Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-1-1337522626',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-gen-1-1337522626',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2036320412-ge',id=52,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBPUz5IvvjDwpHmHFD+orw8ijcCg0K2OjkrWrFrg2yfYVF0H/4L3w+fOGJrdF/TD0MBWI2TYTY3cn3BSGdGEqLD630J7pJz50QUHTJr6VmH2nVS35zTlVO9F1/aMKjOh5tQ==',key_name='tempest-TestSecurityGroupsBasicOps-881694630',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:22:01Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9d8630abd3cd4aef89d0b1af6e62ac93',ramdisk_id='',reservation_id='r-ktcgbhu4',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2036320412',owner_user_name='tempest-TestSecurityGroupsBasicOps-2036320412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:22:01Z,user_data=None,user_id='78548cbaea0e406ebb716882c382c954',uuid=0cbb99be-6747-44eb-887b-7b96fd8f5780,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "address": "fa:16:3e:79:33:3b", "network": {"id": "c4ce3890-fdca-4f63-911e-50acc18a43b6", "bridge": "br-int", "label": "tempest-network-smoke--1939158281", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26eb69c4-0a", "ovs_interfaceid": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.768 187132 DEBUG nova.network.os_vif_util [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converting VIF {"id": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "address": "fa:16:3e:79:33:3b", "network": {"id": "c4ce3890-fdca-4f63-911e-50acc18a43b6", "bridge": "br-int", "label": "tempest-network-smoke--1939158281", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.4", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap26eb69c4-0a", "ovs_interfaceid": "26eb69c4-0a27-4b9c-be6e-942db09809ed", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.770 187132 DEBUG nova.network.os_vif_util [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:79:33:3b,bridge_name='br-int',has_traffic_filtering=True,id=26eb69c4-0a27-4b9c-be6e-942db09809ed,network=Network(c4ce3890-fdca-4f63-911e-50acc18a43b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26eb69c4-0a') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:22:20 np0005554845 podman[225941]: 2025-12-11 06:22:20.77089658 +0000 UTC m=+0.048917349 container remove d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.771 187132 DEBUG os_vif [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:79:33:3b,bridge_name='br-int',has_traffic_filtering=True,id=26eb69c4-0a27-4b9c-be6e-942db09809ed,network=Network(c4ce3890-fdca-4f63-911e-50acc18a43b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26eb69c4-0a') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.773 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.774 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap26eb69c4-0a, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.775 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.777 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.776 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[61a8c966-96e5-4e38-b516-21e05fde3c5f]: (4, ('Thu Dec 11 06:22:20 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6 (d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a)\nd24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a\nThu Dec 11 06:22:20 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6 (d24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a)\nd24b7a587a5a0523a0ebb8927effe146ad076c92830c769903dbdaf457a09b0a\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.779 187132 INFO os_vif [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:79:33:3b,bridge_name='br-int',has_traffic_filtering=True,id=26eb69c4-0a27-4b9c-be6e-942db09809ed,network=Network(c4ce3890-fdca-4f63-911e-50acc18a43b6),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap26eb69c4-0a')#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.779 187132 INFO nova.virt.libvirt.driver [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Deleting instance files /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780_del#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.779 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3259c85f-8f82-4af3-bc4d-9af45ec9be80]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.780 187132 INFO nova.virt.libvirt.driver [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Deletion of /var/lib/nova/instances/0cbb99be-6747-44eb-887b-7b96fd8f5780_del complete#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.781 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tapc4ce3890-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.782 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:20 np0005554845 kernel: tapc4ce3890-f0: left promiscuous mode
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.795 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.796 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a3e8637a-baee-4232-bad7-8f4b06e55b4c]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.812 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[9ca077a2-1d77-49e9-872a-eedca259bd6b]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.814 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[a16b8ded-3267-4456-9d67-2be1f00bcf72]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.831 187132 INFO nova.compute.manager [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Took 0.39 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.832 187132 DEBUG oslo.service.loopingcall [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.833 187132 DEBUG nova.compute.manager [-] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:22:20 np0005554845 nova_compute[187128]: 2025-12-11 06:22:20.833 187132 DEBUG nova.network.neutron [-] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.839 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ac470aba-d532-4c0f-a3ba-4a11887149ac]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 439519, 'reachable_time': 34912, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 225956, 'error': None, 'target': 'ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.842 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-c4ce3890-fdca-4f63-911e-50acc18a43b6 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:22:20 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:20.842 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[549fcd2b-f6dc-414f-9df8-c8a4ee934b3a]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:20 np0005554845 systemd[1]: run-netns-ovnmeta\x2dc4ce3890\x2dfdca\x2d4f63\x2d911e\x2d50acc18a43b6.mount: Deactivated successfully.
Dec 11 01:22:21 np0005554845 nova_compute[187128]: 2025-12-11 06:22:21.221 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:21 np0005554845 nova_compute[187128]: 2025-12-11 06:22:21.591 187132 DEBUG nova.network.neutron [-] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:22:21 np0005554845 nova_compute[187128]: 2025-12-11 06:22:21.609 187132 INFO nova.compute.manager [-] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Took 0.78 seconds to deallocate network for instance.#033[00m
Dec 11 01:22:21 np0005554845 nova_compute[187128]: 2025-12-11 06:22:21.651 187132 DEBUG oslo_concurrency.lockutils [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:21 np0005554845 nova_compute[187128]: 2025-12-11 06:22:21.651 187132 DEBUG oslo_concurrency.lockutils [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:21 np0005554845 nova_compute[187128]: 2025-12-11 06:22:21.711 187132 DEBUG nova.compute.provider_tree [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:22:21 np0005554845 nova_compute[187128]: 2025-12-11 06:22:21.764 187132 DEBUG nova.scheduler.client.report [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:22:21 np0005554845 nova_compute[187128]: 2025-12-11 06:22:21.797 187132 DEBUG oslo_concurrency.lockutils [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.146s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:21 np0005554845 nova_compute[187128]: 2025-12-11 06:22:21.830 187132 INFO nova.scheduler.client.report [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Deleted allocations for instance 0cbb99be-6747-44eb-887b-7b96fd8f5780#033[00m
Dec 11 01:22:21 np0005554845 nova_compute[187128]: 2025-12-11 06:22:21.906 187132 DEBUG oslo_concurrency.lockutils [None req-6e19c097-b2e5-462f-b3ab-e2ab03f6d9e6 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 1.471s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:23 np0005554845 nova_compute[187128]: 2025-12-11 06:22:23.209 187132 DEBUG nova.compute.manager [req-3a6dc85c-a2b7-4e67-8707-52389d4b20d3 req-b74dcc53-ab5c-48b6-b7f2-25d46b7ab9c2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Received event network-vif-plugged-26eb69c4-0a27-4b9c-be6e-942db09809ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:22:23 np0005554845 nova_compute[187128]: 2025-12-11 06:22:23.210 187132 DEBUG oslo_concurrency.lockutils [req-3a6dc85c-a2b7-4e67-8707-52389d4b20d3 req-b74dcc53-ab5c-48b6-b7f2-25d46b7ab9c2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:23 np0005554845 nova_compute[187128]: 2025-12-11 06:22:23.210 187132 DEBUG oslo_concurrency.lockutils [req-3a6dc85c-a2b7-4e67-8707-52389d4b20d3 req-b74dcc53-ab5c-48b6-b7f2-25d46b7ab9c2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:23 np0005554845 nova_compute[187128]: 2025-12-11 06:22:23.211 187132 DEBUG oslo_concurrency.lockutils [req-3a6dc85c-a2b7-4e67-8707-52389d4b20d3 req-b74dcc53-ab5c-48b6-b7f2-25d46b7ab9c2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "0cbb99be-6747-44eb-887b-7b96fd8f5780-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:23 np0005554845 nova_compute[187128]: 2025-12-11 06:22:23.211 187132 DEBUG nova.compute.manager [req-3a6dc85c-a2b7-4e67-8707-52389d4b20d3 req-b74dcc53-ab5c-48b6-b7f2-25d46b7ab9c2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] No waiting events found dispatching network-vif-plugged-26eb69c4-0a27-4b9c-be6e-942db09809ed pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:22:23 np0005554845 nova_compute[187128]: 2025-12-11 06:22:23.212 187132 WARNING nova.compute.manager [req-3a6dc85c-a2b7-4e67-8707-52389d4b20d3 req-b74dcc53-ab5c-48b6-b7f2-25d46b7ab9c2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Received unexpected event network-vif-plugged-26eb69c4-0a27-4b9c-be6e-942db09809ed for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:22:23 np0005554845 nova_compute[187128]: 2025-12-11 06:22:23.212 187132 DEBUG nova.compute.manager [req-3a6dc85c-a2b7-4e67-8707-52389d4b20d3 req-b74dcc53-ab5c-48b6-b7f2-25d46b7ab9c2 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Received event network-vif-deleted-26eb69c4-0a27-4b9c-be6e-942db09809ed external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:22:25 np0005554845 nova_compute[187128]: 2025-12-11 06:22:25.812 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:26 np0005554845 nova_compute[187128]: 2025-12-11 06:22:26.222 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:26.234 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:26.235 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:26.235 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:28 np0005554845 nova_compute[187128]: 2025-12-11 06:22:28.085 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:28 np0005554845 nova_compute[187128]: 2025-12-11 06:22:28.133 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:30 np0005554845 nova_compute[187128]: 2025-12-11 06:22:30.814 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:31 np0005554845 nova_compute[187128]: 2025-12-11 06:22:31.223 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:33 np0005554845 podman[225959]: 2025-12-11 06:22:33.124181432 +0000 UTC m=+0.061079340 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:22:35 np0005554845 nova_compute[187128]: 2025-12-11 06:22:35.711 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765434140.710208, 0cbb99be-6747-44eb-887b-7b96fd8f5780 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:22:35 np0005554845 nova_compute[187128]: 2025-12-11 06:22:35.712 187132 INFO nova.compute.manager [-] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:22:35 np0005554845 nova_compute[187128]: 2025-12-11 06:22:35.757 187132 DEBUG nova.compute.manager [None req-de5fc7bd-078d-4858-9ad9-c484e5259c3b - - - - - -] [instance: 0cbb99be-6747-44eb-887b-7b96fd8f5780] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:22:35 np0005554845 nova_compute[187128]: 2025-12-11 06:22:35.863 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:36 np0005554845 nova_compute[187128]: 2025-12-11 06:22:36.224 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:36 np0005554845 nova_compute[187128]: 2025-12-11 06:22:36.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:22:37 np0005554845 nova_compute[187128]: 2025-12-11 06:22:37.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:22:37 np0005554845 nova_compute[187128]: 2025-12-11 06:22:37.722 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:37 np0005554845 nova_compute[187128]: 2025-12-11 06:22:37.723 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:37 np0005554845 nova_compute[187128]: 2025-12-11 06:22:37.723 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:37 np0005554845 nova_compute[187128]: 2025-12-11 06:22:37.723 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:22:37 np0005554845 nova_compute[187128]: 2025-12-11 06:22:37.913 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:22:37 np0005554845 nova_compute[187128]: 2025-12-11 06:22:37.915 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5704MB free_disk=73.29178237915039GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:22:37 np0005554845 nova_compute[187128]: 2025-12-11 06:22:37.915 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:37 np0005554845 nova_compute[187128]: 2025-12-11 06:22:37.915 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:38 np0005554845 nova_compute[187128]: 2025-12-11 06:22:38.033 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:22:38 np0005554845 nova_compute[187128]: 2025-12-11 06:22:38.034 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:22:38 np0005554845 nova_compute[187128]: 2025-12-11 06:22:38.053 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:22:38 np0005554845 nova_compute[187128]: 2025-12-11 06:22:38.069 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:22:38 np0005554845 nova_compute[187128]: 2025-12-11 06:22:38.088 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:22:38 np0005554845 nova_compute[187128]: 2025-12-11 06:22:38.088 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.173s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:39 np0005554845 nova_compute[187128]: 2025-12-11 06:22:39.083 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:22:39 np0005554845 podman[225985]: 2025-12-11 06:22:39.148827794 +0000 UTC m=+0.082093849 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS)
Dec 11 01:22:39 np0005554845 nova_compute[187128]: 2025-12-11 06:22:39.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:22:39 np0005554845 nova_compute[187128]: 2025-12-11 06:22:39.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:22:39 np0005554845 nova_compute[187128]: 2025-12-11 06:22:39.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:22:39 np0005554845 nova_compute[187128]: 2025-12-11 06:22:39.708 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:22:39 np0005554845 nova_compute[187128]: 2025-12-11 06:22:39.708 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:22:39 np0005554845 nova_compute[187128]: 2025-12-11 06:22:39.708 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:22:39 np0005554845 nova_compute[187128]: 2025-12-11 06:22:39.709 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:22:39 np0005554845 nova_compute[187128]: 2025-12-11 06:22:39.709 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:22:40 np0005554845 nova_compute[187128]: 2025-12-11 06:22:40.704 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:22:40 np0005554845 nova_compute[187128]: 2025-12-11 06:22:40.865 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:41 np0005554845 podman[226005]: 2025-12-11 06:22:41.146632713 +0000 UTC m=+0.070099144 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 11 01:22:41 np0005554845 podman[226006]: 2025-12-11 06:22:41.181587492 +0000 UTC m=+0.101220459 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, org.label-schema.schema-version=1.0, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.vendor=CentOS)
Dec 11 01:22:41 np0005554845 nova_compute[187128]: 2025-12-11 06:22:41.225 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:41 np0005554845 nova_compute[187128]: 2025-12-11 06:22:41.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:22:41 np0005554845 nova_compute[187128]: 2025-12-11 06:22:41.967 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "7030a3a1-028c-4af8-a8bf-008a08a52227" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:41 np0005554845 nova_compute[187128]: 2025-12-11 06:22:41.967 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227" acquired by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:41 np0005554845 nova_compute[187128]: 2025-12-11 06:22:41.992 187132 DEBUG nova.compute.manager [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Starting instance... _do_build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2402#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.157 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.158 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.165 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Require both a host and instance NUMA topology to fit instance on host. numa_fit_instance_to_host /usr/lib/python3.9/site-packages/nova/virt/hardware.py:2368#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.165 187132 INFO nova.compute.claims [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Claim successful on node compute-2.ctlplane.example.com#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.338 187132 DEBUG nova.compute.provider_tree [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.355 187132 DEBUG nova.scheduler.client.report [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.412 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.instance_claim" :: held 0.255s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.413 187132 DEBUG nova.compute.manager [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Start building networks asynchronously for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2799#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.472 187132 DEBUG nova.compute.manager [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Allocating IP information in the background. _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1952#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.472 187132 DEBUG nova.network.neutron [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] allocate_for_instance() allocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1156#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.498 187132 INFO nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Ignoring supplied device name: /dev/vda. Libvirt can't honour user-supplied dev names#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.520 187132 DEBUG nova.compute.manager [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Start building block device mappings for instance. _build_resources /usr/lib/python3.9/site-packages/nova/compute/manager.py:2834#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.745 187132 DEBUG nova.compute.manager [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Start spawning the instance on the hypervisor. _build_and_run_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:2608#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.747 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Creating instance directory _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4723#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.747 187132 INFO nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Creating image(s)#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.748 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "/var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk.info" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.748 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "/var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk.info" acquired by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.749 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "/var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk.info" "released" by "nova.virt.libvirt.imagebackend.Image.resolve_driver_format.<locals>.write_to_disk_info_file" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.760 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.827 187132 DEBUG nova.policy [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Policy check for network:attach_external_network failed with credentials {'is_admin': False, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_domain_id': 'default', 'system_scope': None, 'domain_id': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_domain_id': 'default', 'roles': ['reader', 'member'], 'is_admin_project': True, 'service_user_id': None, 'service_user_domain_id': None, 'service_project_id': None, 'service_project_domain_id': None, 'service_roles': []} authorize /usr/lib/python3.9/site-packages/nova/policy.py:203#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.830 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.069s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.830 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.831 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" acquired by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.842 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.931 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.088s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.932 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk 1073741824 execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.981 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "env LC_ALL=C LANG=C qemu-img create -f qcow2 -o backing_file=/var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165,backing_fmt=raw /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk 1073741824" returned: 0 in 0.048s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.982 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "36b9b148fe5f26e0eb983e4b99988b0d87b51165" "released" by "nova.virt.libvirt.imagebackend.Qcow2.create_image.<locals>.create_qcow2_image" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:42 np0005554845 nova_compute[187128]: 2025-12-11 06:22:42.983 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.041 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/_base/36b9b148fe5f26e0eb983e4b99988b0d87b51165 --force-share --output=json" returned: 0 in 0.058s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.042 187132 DEBUG nova.virt.disk.api [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Checking if we can resize image /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk. size=1073741824 can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:166#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.043 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.103 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk --force-share --output=json" returned: 0 in 0.061s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.104 187132 DEBUG nova.virt.disk.api [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Cannot resize image /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk to a smaller size. can_resize_image /usr/lib/python3.9/site-packages/nova/virt/disk/api.py:172#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.104 187132 DEBUG nova.objects.instance [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lazy-loading 'migration_context' on Instance uuid 7030a3a1-028c-4af8-a8bf-008a08a52227 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.124 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Created local disks _create_image /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4857#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.124 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Ensure instance console log exists: /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/console.log _ensure_console_log_for_instance /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4609#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.125 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "vgpu_resources" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.125 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "vgpu_resources" acquired by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.125 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "vgpu_resources" "released" by "nova.virt.libvirt.driver.LibvirtDriver._allocate_mdevs" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:43 np0005554845 nova_compute[187128]: 2025-12-11 06:22:43.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:22:45 np0005554845 podman[226064]: 2025-12-11 06:22:45.12716882 +0000 UTC m=+0.056814253 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:22:45 np0005554845 nova_compute[187128]: 2025-12-11 06:22:45.294 187132 DEBUG nova.network.neutron [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Successfully created port: 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 _create_port_minimal /usr/lib/python3.9/site-packages/nova/network/neutron.py:548#033[00m
Dec 11 01:22:45 np0005554845 nova_compute[187128]: 2025-12-11 06:22:45.867 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:46 np0005554845 nova_compute[187128]: 2025-12-11 06:22:46.227 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:46 np0005554845 nova_compute[187128]: 2025-12-11 06:22:46.722 187132 DEBUG nova.network.neutron [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Successfully updated port: 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 _update_port /usr/lib/python3.9/site-packages/nova/network/neutron.py:586#033[00m
Dec 11 01:22:46 np0005554845 nova_compute[187128]: 2025-12-11 06:22:46.742 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:22:46 np0005554845 nova_compute[187128]: 2025-12-11 06:22:46.743 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquired lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:22:46 np0005554845 nova_compute[187128]: 2025-12-11 06:22:46.743 187132 DEBUG nova.network.neutron [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Building network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2010#033[00m
Dec 11 01:22:46 np0005554845 nova_compute[187128]: 2025-12-11 06:22:46.857 187132 DEBUG nova.compute.manager [req-018db39f-55e3-4e2b-9617-754dde913379 req-7a2b0f26-6a38-40e8-be75-b7220e159a2b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Received event network-changed-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:22:46 np0005554845 nova_compute[187128]: 2025-12-11 06:22:46.857 187132 DEBUG nova.compute.manager [req-018db39f-55e3-4e2b-9617-754dde913379 req-7a2b0f26-6a38-40e8-be75-b7220e159a2b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Refreshing instance network info cache due to event network-changed-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:22:46 np0005554845 nova_compute[187128]: 2025-12-11 06:22:46.858 187132 DEBUG oslo_concurrency.lockutils [req-018db39f-55e3-4e2b-9617-754dde913379 req-7a2b0f26-6a38-40e8-be75-b7220e159a2b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:22:46 np0005554845 nova_compute[187128]: 2025-12-11 06:22:46.954 187132 DEBUG nova.network.neutron [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Instance cache missing network info. _get_preexisting_port_ids /usr/lib/python3.9/site-packages/nova/network/neutron.py:3323#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.068 187132 DEBUG nova.network.neutron [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Updating instance_info_cache with network_info: [{"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.095 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Releasing lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.095 187132 DEBUG nova.compute.manager [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Instance network_info: |[{"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}]| _allocate_network_async /usr/lib/python3.9/site-packages/nova/compute/manager.py:1967#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.095 187132 DEBUG oslo_concurrency.lockutils [req-018db39f-55e3-4e2b-9617-754dde913379 req-7a2b0f26-6a38-40e8-be75-b7220e159a2b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.096 187132 DEBUG nova.network.neutron [req-018db39f-55e3-4e2b-9617-754dde913379 req-7a2b0f26-6a38-40e8-be75-b7220e159a2b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Refreshing network info cache for port 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.098 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Start _get_guest_xml network_info=[{"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] disk_info={'disk_bus': 'virtio', 'cdrom_bus': 'sata', 'mapping': {'root': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk': {'bus': 'virtio', 'dev': 'vda', 'type': 'disk', 'boot_index': '1'}, 'disk.config': {'bus': 'sata', 'dev': 'sda', 'type': 'cdrom'}}} 
image_meta=ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>) rescue=None block_device_info={'root_device_name': '/dev/vda', 'image': [{'boot_index': 0, 'device_name': '/dev/vda', 'size': 0, 'encryption_secret_uuid': None, 'encryption_format': None, 'encrypted': False, 'guest_format': None, 'disk_bus': 'virtio', 'device_type': 'disk', 'encryption_options': None, 'image_id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}], 'ephemerals': [], 'block_device_mapping': [], 'swap': None} _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7549#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.102 187132 WARNING nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.108 187132 DEBUG nova.virt.libvirt.host [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V1... _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1653#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.109 187132 DEBUG nova.virt.libvirt.host [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CPU controller missing on host. _has_cgroupsv1_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1663#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.114 187132 DEBUG nova.virt.libvirt.host [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Searching host: 'compute-2.ctlplane.example.com' for CPU controller through CGroups V2... _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1672#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.115 187132 DEBUG nova.virt.libvirt.host [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CPU controller found on host. _has_cgroupsv2_cpu_controller /usr/lib/python3.9/site-packages/nova/virt/libvirt/host.py:1679#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.116 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CPU mode 'custom' models 'Nehalem' was chosen, with extra flags: '' _get_guest_cpu_model_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:5396#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.116 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Getting desirable topologies for flavor Flavor(created_at=2025-12-11T06:03:51Z,deleted=False,deleted_at=None,description=None,disabled=False,ephemeral_gb=0,extra_specs={hw_rng:allowed='True'},flavorid='604ddafe-0c56-4202-93c6-01236db9ae98',id=1,is_public=True,memory_mb=128,name='m1.nano',projects=<?>,root_gb=1,rxtx_factor=1.0,swap=0,updated_at=None,vcpu_weight=0,vcpus=1) and image_meta ImageMeta(checksum='c8fc807773e5354afe61636071771906',container_format='bare',created_at=2025-12-11T06:03:53Z,direct_url=<?>,disk_format='qcow2',id=8999c077-a9de-4930-873b-81a3bd2d6c5f,min_disk=0,min_ram=0,name='cirros-0.6.2-x86_64-disk.img',owner='58891547c7294a57a183f092c2e8f0a6',properties=ImageMetaProps,protected=<?>,size=21430272,status='active',tags=<?>,updated_at=2025-12-11T06:03:55Z,virtual_size=<?>,visibility=<?>), allow threads: True _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:563#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.116 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Flavor limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:348#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.116 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Image limits 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:352#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.117 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Flavor pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:388#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.117 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Image pref 0:0:0 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:392#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.117 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Chose sockets=0, cores=0, threads=0; limits were sockets=65536, cores=65536, threads=65536 get_cpu_topology_constraints /usr/lib/python3.9/site-packages/nova/virt/hardware.py:430#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.117 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Topology preferred VirtCPUTopology(cores=0,sockets=0,threads=0), maximum VirtCPUTopology(cores=65536,sockets=65536,threads=65536) _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:569#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.117 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Build topologies for 1 vcpu(s) 1:1:1 _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:471#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.118 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Got 1 possible topologies _get_possible_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:501#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.118 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Possible topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:575#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.118 187132 DEBUG nova.virt.hardware [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Sorted desired topologies [VirtCPUTopology(cores=1,sockets=1,threads=1)] _get_desirable_cpu_topologies /usr/lib/python3.9/site-packages/nova/virt/hardware.py:577#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.122 187132 DEBUG nova.virt.libvirt.vif [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:22:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2036320412-ac',id=54,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCcBU2AiCiG43plJXeuCC5swRewcLeFXgxthvl4QIA/JXvysQvegVCEFCqn/SWbo/hhdJCKug/ac2zzwPBAlkyrvklynj1zrxLiffMjoakg13WIhbQT3eDWM+5WvMrCymQ==',key_name='tempest-TestSecurityGroupsBasicOps-1772198906',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d8630abd3cd4aef89d0b1af6e62ac93',ramdisk_id='',reservation_id='r-e2124y2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2036320412',owner_user_name='tempest-TestSecurityGroupsBasicOps-2036320412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:22:42Z,user_data=None,user_id='78548cbaea0e406ebb716882c382c954',uuid=7030a3a1-028c-4af8-a8bf-008a08a52227,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": 
true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} virt_type=kvm get_config /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:563#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.123 187132 DEBUG nova.network.os_vif_util [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converting VIF {"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:22:50 np0005554845 podman[226084]: 2025-12-11 06:22:50.123662086 +0000 UTC m=+0.053895765 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.124 187132 DEBUG nova.network.os_vif_util [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:89:e4,bridge_name='br-int',has_traffic_filtering=True,id=7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6,network=Network(30a3c425-ff4d-4be0-b139-7bb4a9781599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b9f5b7c-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.125 187132 DEBUG nova.objects.instance [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lazy-loading 'pci_devices' on Instance uuid 7030a3a1-028c-4af8-a8bf-008a08a52227 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:22:50 np0005554845 podman[226085]: 2025-12-11 06:22:50.136446623 +0000 UTC m=+0.067431792 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, io.openshift.expose-services=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, name=ubi9-minimal, vcs-type=git, 
maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, architecture=x86_64, io.buildah.version=1.33.7, config_id=edpm, version=9.6)
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.146 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] End _get_guest_xml xml=<domain type="kvm">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  <uuid>7030a3a1-028c-4af8-a8bf-008a08a52227</uuid>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  <name>instance-00000036</name>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  <memory>131072</memory>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  <vcpu>1</vcpu>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  <metadata>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <nova:instance xmlns:nova="http://openstack.org/xmlns/libvirt/nova/1.1">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <nova:package version="27.5.2-0.20250829104910.6f8decf.el9"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <nova:name>tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649</nova:name>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <nova:creationTime>2025-12-11 06:22:50</nova:creationTime>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <nova:flavor name="m1.nano">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:        <nova:memory>128</nova:memory>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:        <nova:disk>1</nova:disk>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:        <nova:swap>0</nova:swap>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:        <nova:ephemeral>0</nova:ephemeral>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:        <nova:vcpus>1</nova:vcpus>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      </nova:flavor>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <nova:owner>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:        <nova:user uuid="78548cbaea0e406ebb716882c382c954">tempest-TestSecurityGroupsBasicOps-2036320412-project-member</nova:user>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:        <nova:project uuid="9d8630abd3cd4aef89d0b1af6e62ac93">tempest-TestSecurityGroupsBasicOps-2036320412</nova:project>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      </nova:owner>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <nova:root type="image" uuid="8999c077-a9de-4930-873b-81a3bd2d6c5f"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <nova:ports>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:        <nova:port uuid="7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:          <nova:ip type="fixed" address="10.100.0.3" ipVersion="4"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:        </nova:port>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      </nova:ports>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    </nova:instance>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  </metadata>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  <sysinfo type="smbios">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <system>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <entry name="manufacturer">RDO</entry>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <entry name="product">OpenStack Compute</entry>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <entry name="version">27.5.2-0.20250829104910.6f8decf.el9</entry>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <entry name="serial">7030a3a1-028c-4af8-a8bf-008a08a52227</entry>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <entry name="uuid">7030a3a1-028c-4af8-a8bf-008a08a52227</entry>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <entry name="family">Virtual Machine</entry>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    </system>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  </sysinfo>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  <os>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <type arch="x86_64" machine="q35">hvm</type>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <boot dev="hd"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <smbios mode="sysinfo"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  </os>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  <features>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <acpi/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <apic/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <vmcoreinfo/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  </features>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  <clock offset="utc">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <timer name="pit" tickpolicy="delay"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <timer name="rtc" tickpolicy="catchup"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <timer name="hpet" present="no"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  </clock>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  <cpu mode="custom" match="exact">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <model>Nehalem</model>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <topology sockets="1" cores="1" threads="1"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  </cpu>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  <devices>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <disk type="file" device="disk">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <driver name="qemu" type="qcow2" cache="none"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <target dev="vda" bus="virtio"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <disk type="file" device="cdrom">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <driver name="qemu" type="raw" cache="none"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <source file="/var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk.config"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <target dev="sda" bus="sata"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    </disk>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <interface type="ethernet">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <mac address="fa:16:3e:e3:89:e4"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <driver name="vhost" rx_queue_size="512"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <mtu size="1442"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <target dev="tap7b9f5b7c-eb"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    </interface>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <serial type="pty">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <log file="/var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/console.log" append="off"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    </serial>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <graphics type="vnc" autoport="yes" listen="::0"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <video>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <model type="virtio"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    </video>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <input type="tablet" bus="usb"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <rng model="virtio">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <backend model="random">/dev/urandom</backend>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    </rng>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="pci" model="pcie-root-port"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <controller type="usb" index="0"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    <memballoon model="virtio">
Dec 11 01:22:50 np0005554845 nova_compute[187128]:      <stats period="10"/>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:    </memballoon>
Dec 11 01:22:50 np0005554845 nova_compute[187128]:  </devices>
Dec 11 01:22:50 np0005554845 nova_compute[187128]: </domain>
Dec 11 01:22:50 np0005554845 nova_compute[187128]: _get_guest_xml /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:7555#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.147 187132 DEBUG nova.compute.manager [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Preparing to wait for external event network-vif-plugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 prepare_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:283#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.147 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.148 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" acquired by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.148 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" "released" by "nova.compute.manager.InstanceEvents.prepare_for_instance_event.<locals>._create_or_get_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.149 187132 DEBUG nova.virt.libvirt.vif [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='',created_at=2025-12-11T06:22:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=None,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649',ec2_ids=EC2Ids,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2036320412-ac',id=54,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCcBU2AiCiG43plJXeuCC5swRewcLeFXgxthvl4QIA/JXvysQvegVCEFCqn/SWbo/hhdJCKug/ac2zzwPBAlkyrvklynj1zrxLiffMjoakg13WIhbQT3eDWM+5WvMrCymQ==',key_name='tempest-TestSecurityGroupsBasicOps-1772198906',keypairs=KeyPairList,launch_index=0,launched_at=None,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=None,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=PciDeviceList,pci_requests=InstancePCIRequests,power_state=0,progress=0,project_id='9d8630abd3cd4aef89d0b1af6e62ac93',ramdisk_id='',reservation_id='r-e2124y2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_machine_type='q35',image_hw_rng_model='virtio',image_min_disk='1',image_min_ram='0',network_allocated='True',owner_project_name='tempest-TestSecurityGroupsBasicOps-2036320412',owner_user_name='tempest-TestSecurityGroupsBasicOps-2036320412-project-member'},tags=TagList,task_state='spawning',terminated_at=None,trusted_certs=None,updated_at=2025-12-11T06:22:42Z,user_data=None,user_id='78548cbaea0e406ebb716882c382c954',uuid=7030a3a1-028c-4af8-a8bf-008a08a52227,vcpu_model=VirtCPUModel,vcpus=1,vm_mode=None,vm_state='building') vif={"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": 
{"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} plug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:710#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.149 187132 DEBUG nova.network.os_vif_util [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converting VIF {"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.150 187132 DEBUG nova.network.os_vif_util [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converted object VIFOpenVSwitch(active=False,address=fa:16:3e:e3:89:e4,bridge_name='br-int',has_traffic_filtering=True,id=7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6,network=Network(30a3c425-ff4d-4be0-b139-7bb4a9781599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b9f5b7c-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.150 187132 DEBUG os_vif [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Plugging vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:89:e4,bridge_name='br-int',has_traffic_filtering=True,id=7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6,network=Network(30a3c425-ff4d-4be0-b139-7bb4a9781599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b9f5b7c-eb') plug /usr/lib/python3.9/site-packages/os_vif/__init__.py:76#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.150 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.151 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddBridgeCommand(_result=None, name=br-int, may_exist=True, datapath_type=system) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.151 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.156 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.156 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap7b9f5b7c-eb, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.157 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=1): DbSetCommand(_result=None, table=Interface, record=tap7b9f5b7c-eb, col_values=(('external_ids', {'iface-id': '7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6', 'iface-status': 'active', 'attached-mac': 'fa:16:3e:e3:89:e4', 'vm-uuid': '7030a3a1-028c-4af8-a8bf-008a08a52227'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.158 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:50 np0005554845 NetworkManager[55529]: <info>  [1765434170.1599] manager: (tap7b9f5b7c-eb): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/169)
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.160 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.166 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.167 187132 INFO os_vif [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Successfully plugged vif VIFOpenVSwitch(active=False,address=fa:16:3e:e3:89:e4,bridge_name='br-int',has_traffic_filtering=True,id=7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6,network=Network(30a3c425-ff4d-4be0-b139-7bb4a9781599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b9f5b7c-eb')#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.237 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] No BDM found with device name vda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.237 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] No BDM found with device name sda, not building metadata. _build_disk_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12116#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.237 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] No VIF found with MAC fa:16:3e:e3:89:e4, not building metadata _build_interface_metadata /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:12092#033[00m
Dec 11 01:22:50 np0005554845 nova_compute[187128]: 2025-12-11 06:22:50.238 187132 INFO nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Using config drive#033[00m
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.229 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.340 187132 INFO nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Creating config drive at /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk.config#033[00m
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.344 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Running cmd (subprocess): /usr/bin/mkisofs -o /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5k3888qn execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.466 187132 DEBUG oslo_concurrency.processutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] CMD "/usr/bin/mkisofs -o /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk.config -ldots -allow-lowercase -allow-multidot -l -publisher OpenStack Compute 27.5.2-0.20250829104910.6f8decf.el9 -quiet -J -r -V config-2 /tmp/tmp5k3888qn" returned: 0 in 0.122s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:22:51 np0005554845 kernel: tap7b9f5b7c-eb: entered promiscuous mode
Dec 11 01:22:51 np0005554845 NetworkManager[55529]: <info>  [1765434171.5260] manager: (tap7b9f5b7c-eb): new Tun device (/org/freedesktop/NetworkManager/Devices/170)
Dec 11 01:22:51 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:51Z|00326|binding|INFO|Claiming lport 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 for this chassis.
Dec 11 01:22:51 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:51Z|00327|binding|INFO|7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6: Claiming fa:16:3e:e3:89:e4 10.100.0.3
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.526 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.531 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.539 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:89:e4 10.100.0.3'], port_security=['fa:16:3e:e3:89:e4 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30a3c425-ff4d-4be0-b139-7bb4a9781599', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'neutron:revision_number': '2', 'neutron:security_group_ids': '40d159e8-a87c-4e2a-ad3b-023cf01f5577 c95e8f7e-138d-43e5-9624-cbbdd47e0fbb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c3b1014-bbb0-4802-b8e5-65838069075c, chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6) old=Port_Binding(chassis=[]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.540 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 in datapath 30a3c425-ff4d-4be0-b139-7bb4a9781599 bound to our chassis#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.541 104320 INFO neutron.agent.ovn.metadata.agent [-] Provisioning metadata for network 30a3c425-ff4d-4be0-b139-7bb4a9781599#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.555 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[bc9b3d2c-b496-4c58-a454-afbaffd25389]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.555 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Creating VETH tap30a3c425-f1 in ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599 namespace provision_datapath /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:665#033[00m
Dec 11 01:22:51 np0005554845 systemd-udevd[226149]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.557 213683 DEBUG neutron.privileged.agent.linux.ip_lib [-] Interface tap30a3c425-f0 not found in namespace None get_link_id /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:204#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.558 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[5dbf484d-e299-47f6-bcfb-f74724e0316e]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.558 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[97964b2c-8dd0-4094-a553-b7f02fc1c893]: (4, False) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 systemd-machined[153381]: New machine qemu-26-instance-00000036.
Dec 11 01:22:51 np0005554845 NetworkManager[55529]: <info>  [1765434171.5683] device (tap7b9f5b7c-eb): state change: unmanaged -> unavailable (reason 'connection-assumed', managed-type: 'external')
Dec 11 01:22:51 np0005554845 NetworkManager[55529]: <info>  [1765434171.5692] device (tap7b9f5b7c-eb): state change: unavailable -> disconnected (reason 'none', managed-type: 'external')
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.573 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[940f0ac8-779d-478c-bf8d-629beb80f426]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.586 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:51 np0005554845 systemd[1]: Started Virtual Machine qemu-26-instance-00000036.
Dec 11 01:22:51 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:51Z|00328|binding|INFO|Setting lport 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 ovn-installed in OVS
Dec 11 01:22:51 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:51Z|00329|binding|INFO|Setting lport 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 up in Southbound
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.592 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.597 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ca3b0f55-e595-4bf8-ae2b-b75f3d9c5054]: (4, ('net.ipv4.conf.all.promote_secondaries = 1\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.621 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[e7a8c1b2-acf3-420e-95a3-ebbeaac40287]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.625 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[1ea4c154-fbff-46a4-8eb5-badbb1b17a88]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 NetworkManager[55529]: <info>  [1765434171.6261] manager: (tap30a3c425-f0): new Veth device (/org/freedesktop/NetworkManager/Devices/171)
Dec 11 01:22:51 np0005554845 systemd-udevd[226153]: Network interface NamePolicy= disabled on kernel command line.
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.651 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[e2083065-8f3c-4cb0-b4e2-ee649f36d754]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.654 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[21afe90b-b874-4f74-b007-d0e166a3609d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 NetworkManager[55529]: <info>  [1765434171.6749] device (tap30a3c425-f0): carrier: link connected
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.678 213899 DEBUG oslo.privsep.daemon [-] privsep: reply[ace1c9fb-9676-4ed5-bec4-71abbf24fa61]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.693 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[ed7e9acd-bbd4-40b4-b2e4-3172857d1ad1]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30a3c425-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:8a:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444632, 'reachable_time': 21176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226182, 'error': None, 'target': 'ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.705 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c527202e-fe50-4456-996a-004303d8c5b2]: (4, ({'family': 10, 'prefixlen': 64, 'flags': 192, 'scope': 253, 'index': 2, 'attrs': [['IFA_ADDRESS', 'fe80::f816:3eff:fea7:8a55'], ['IFA_CACHEINFO', {'ifa_preferred': 4294967295, 'ifa_valid': 4294967295, 'cstamp': 444632, 'tstamp': 444632}], ['IFA_FLAGS', 192]], 'header': {'length': 72, 'type': 20, 'flags': 2, 'sequence_number': 255, 'pid': 226183, 'error': None, 'target': 'ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599', 'stats': (0, 0, 0)}, 'event': 'RTM_NEWADDR'},)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.720 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[3425c8a6-6fc8-4b25-abc4-da1ba62ad21f]: (4, [{'family': 0, '__align': (), 'ifi_type': 1, 'index': 2, 'flags': 69699, 'change': 0, 'attrs': [['IFLA_IFNAME', 'tap30a3c425-f1'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UP'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 1500], ['IFLA_MIN_MTU', 68], ['IFLA_MAX_MTU', 65535], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 8], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 8], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 2], ['IFLA_CARRIER_UP_COUNT', 1], ['IFLA_CARRIER_DOWN_COUNT', 1], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', 'fa:16:3e:a7:8a:55'], ['IFLA_BROADCAST', 'ff:ff:ff:ff:ff:ff'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 90, 'tx_bytes': 90, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_LINKINFO', {'attrs': [['IFLA_INFO_KIND', 'veth']]}], ['IFLA_LINK_NETNSID', 0], ['IFLA_LINK', 100], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 0, 'nopolicy': 0, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444632, 'reachable_time': 21176, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 1500, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 0, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 1, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 1, 'inoctets': 76, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 1, 'outoctets': 76, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 
'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 1, 'outmcastpkts': 1, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 76, 'outmcastoctets': 76, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 1, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 1, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1448, 'type': 16, 'flags': 0, 'sequence_number': 255, 'pid': 226184, 'error': None, 'target': 'ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.749 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[b8d5a076-094a-4f83-87b1-cdc7834ab273]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.806 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[138fc06e-f970-4629-bd76-806b6b09fe63]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.808 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30a3c425-f0, bridge=br-ex, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.808 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Transaction caused no change do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:129#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.809 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): AddPortCommand(_result=None, bridge=br-int, port=tap30a3c425-f0, may_exist=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:51 np0005554845 kernel: tap30a3c425-f0: entered promiscuous mode
Dec 11 01:22:51 np0005554845 NetworkManager[55529]: <info>  [1765434171.8113] manager: (tap30a3c425-f0): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/172)
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.810 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.812 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.815 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Interface, record=tap30a3c425-f0, col_values=(('external_ids', {'iface-id': '758d7f57-8e11-417a-afa2-2250b564315d'}),)) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:51 np0005554845 ovn_controller[95428]: 2025-12-11T06:22:51Z|00330|binding|INFO|Releasing lport 758d7f57-8e11-417a-afa2-2250b564315d from this chassis (sb_readonly=0)
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.816 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.817 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.817 104320 DEBUG neutron.agent.linux.utils [-] Unable to access /var/lib/neutron/external/pids/30a3c425-ff4d-4be0-b139-7bb4a9781599.pid.haproxy; Error: [Errno 2] No such file or directory: '/var/lib/neutron/external/pids/30a3c425-ff4d-4be0-b139-7bb4a9781599.pid.haproxy' get_value_from_file /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:252#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.818 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[67de38d0-b37b-42a6-9228-0744af021e4d]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.819 104320 DEBUG neutron.agent.ovn.metadata.driver [-] haproxy_cfg = 
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: global
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    log         /dev/log local0 debug
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    log-tag     haproxy-metadata-proxy-30a3c425-ff4d-4be0-b139-7bb4a9781599
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    user        root
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    group       root
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    maxconn     1024
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    pidfile     /var/lib/neutron/external/pids/30a3c425-ff4d-4be0-b139-7bb4a9781599.pid.haproxy
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    daemon
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: defaults
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    log global
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    mode http
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    option httplog
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    option dontlognull
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    option http-server-close
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    option forwardfor
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    retries                 3
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    timeout http-request    30s
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    timeout connect         30s
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    timeout client          32s
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    timeout server          32s
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    timeout http-keep-alive 30s
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: listen listener
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    bind 169.254.169.254:80
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    server metadata /var/lib/neutron/metadata_proxy
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]:    http-request add-header X-OVN-Network-ID 30a3c425-ff4d-4be0-b139-7bb4a9781599
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: create_config_file /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/driver.py:107#033[00m
Dec 11 01:22:51 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:51.821 104320 DEBUG neutron.agent.linux.utils [-] Running command: ['sudo', 'neutron-rootwrap', '/etc/neutron/rootwrap.conf', 'ip', 'netns', 'exec', 'ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599', 'env', 'PROCESS_TAG=haproxy-30a3c425-ff4d-4be0-b139-7bb4a9781599', 'haproxy', '-f', '/var/lib/neutron/ovn-metadata-proxy/30a3c425-ff4d-4be0-b139-7bb4a9781599.conf'] create_process /usr/lib/python3.9/site-packages/neutron/agent/linux/utils.py:84#033[00m
Dec 11 01:22:51 np0005554845 nova_compute[187128]: 2025-12-11 06:22:51.832 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:52 np0005554845 podman[226216]: 2025-12-11 06:22:52.180276351 +0000 UTC m=+0.060106623 container create 9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS)
Dec 11 01:22:52 np0005554845 systemd[1]: Started libpod-conmon-9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2.scope.
Dec 11 01:22:52 np0005554845 systemd[1]: Started libcrun container.
Dec 11 01:22:52 np0005554845 kernel: xfs filesystem being remounted at /var/lib/containers/storage/overlay/a51d8242b72d632e04f575323e57ce1d8a30f410da7f879f60a62e5e6571f8f8/merged/var/lib/neutron supports timestamps until 2038 (0x7fffffff)
Dec 11 01:22:52 np0005554845 podman[226216]: 2025-12-11 06:22:52.242909851 +0000 UTC m=+0.122740173 container init 9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2)
Dec 11 01:22:52 np0005554845 podman[226216]: 2025-12-11 06:22:52.148821566 +0000 UTC m=+0.028651858 image pull 3695f0466b4af47afdf4b467956f8cc4744d7249671a73e7ca3fd26cca2f59c3 quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified
Dec 11 01:22:52 np0005554845 podman[226216]: 2025-12-11 06:22:52.248301967 +0000 UTC m=+0.128132259 container start 9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, tcib_managed=true)
Dec 11 01:22:52 np0005554845 nova_compute[187128]: 2025-12-11 06:22:52.266 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765434172.2654095, 7030a3a1-028c-4af8-a8bf-008a08a52227 => Started> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:22:52 np0005554845 nova_compute[187128]: 2025-12-11 06:22:52.267 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] VM Started (Lifecycle Event)#033[00m
Dec 11 01:22:52 np0005554845 neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599[226237]: [NOTICE]   (226242) : New worker (226244) forked
Dec 11 01:22:52 np0005554845 neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599[226237]: [NOTICE]   (226242) : Loading success.
Dec 11 01:22:52 np0005554845 nova_compute[187128]: 2025-12-11 06:22:52.287 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:22:52 np0005554845 nova_compute[187128]: 2025-12-11 06:22:52.291 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765434172.2656944, 7030a3a1-028c-4af8-a8bf-008a08a52227 => Paused> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:22:52 np0005554845 nova_compute[187128]: 2025-12-11 06:22:52.291 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] VM Paused (Lifecycle Event)#033[00m
Dec 11 01:22:52 np0005554845 nova_compute[187128]: 2025-12-11 06:22:52.308 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:22:52 np0005554845 nova_compute[187128]: 2025-12-11 06:22:52.311 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Synchronizing instance power state after lifecycle event "Paused"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 3 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:22:52 np0005554845 nova_compute[187128]: 2025-12-11 06:22:52.329 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:22:53 np0005554845 nova_compute[187128]: 2025-12-11 06:22:53.102 187132 DEBUG nova.network.neutron [req-018db39f-55e3-4e2b-9617-754dde913379 req-7a2b0f26-6a38-40e8-be75-b7220e159a2b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Updated VIF entry in instance network info cache for port 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:22:53 np0005554845 nova_compute[187128]: 2025-12-11 06:22:53.103 187132 DEBUG nova.network.neutron [req-018db39f-55e3-4e2b-9617-754dde913379 req-7a2b0f26-6a38-40e8-be75-b7220e159a2b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Updating instance_info_cache with network_info: [{"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": false, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:22:53 np0005554845 nova_compute[187128]: 2025-12-11 06:22:53.122 187132 DEBUG oslo_concurrency.lockutils [req-018db39f-55e3-4e2b-9617-754dde913379 req-7a2b0f26-6a38-40e8-be75-b7220e159a2b eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:22:55 np0005554845 nova_compute[187128]: 2025-12-11 06:22:55.160 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:56.155 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=21, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=20) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:22:56 np0005554845 nova_compute[187128]: 2025-12-11 06:22:56.156 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:56 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:56.157 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 2 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:22:56 np0005554845 nova_compute[187128]: 2025-12-11 06:22:56.232 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.207 187132 DEBUG nova.compute.manager [req-394ffdf1-f866-4cdd-bd92-234568dc00af req-3563ec2e-2d37-4cf6-923f-c45bc0dcd4e7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Received event network-vif-plugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.208 187132 DEBUG oslo_concurrency.lockutils [req-394ffdf1-f866-4cdd-bd92-234568dc00af req-3563ec2e-2d37-4cf6-923f-c45bc0dcd4e7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.208 187132 DEBUG oslo_concurrency.lockutils [req-394ffdf1-f866-4cdd-bd92-234568dc00af req-3563ec2e-2d37-4cf6-923f-c45bc0dcd4e7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.209 187132 DEBUG oslo_concurrency.lockutils [req-394ffdf1-f866-4cdd-bd92-234568dc00af req-3563ec2e-2d37-4cf6-923f-c45bc0dcd4e7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.209 187132 DEBUG nova.compute.manager [req-394ffdf1-f866-4cdd-bd92-234568dc00af req-3563ec2e-2d37-4cf6-923f-c45bc0dcd4e7 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Processing event network-vif-plugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10808#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.210 187132 DEBUG nova.compute.manager [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Instance event wait completed in 4 seconds for network-vif-plugged wait_for_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:577#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.213 187132 DEBUG nova.virt.driver [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] Emitting event <LifecycleEvent: 1765434177.213701, 7030a3a1-028c-4af8-a8bf-008a08a52227 => Resumed> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.214 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] VM Resumed (Lifecycle Event)#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.216 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Guest created on hypervisor spawn /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:4417#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.219 187132 INFO nova.virt.libvirt.driver [-] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Instance spawned successfully.#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.220 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Attempting to register defaults for the following image properties: ['hw_cdrom_bus', 'hw_disk_bus', 'hw_input_bus', 'hw_pointer_model', 'hw_video_model', 'hw_vif_model'] _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:917#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.246 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.255 187132 DEBUG nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Synchronizing instance power state after lifecycle event "Resumed"; current vm_state: building, current task_state: spawning, current DB power_state: 0, VM power_state: 1 handle_lifecycle_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:1396#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.257 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Found default for hw_cdrom_bus of sata _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.257 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Found default for hw_disk_bus of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.258 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Found default for hw_input_bus of usb _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.258 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Found default for hw_pointer_model of usbtablet _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.258 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Found default for hw_video_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.259 187132 DEBUG nova.virt.libvirt.driver [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Found default for hw_vif_model of virtio _register_undefined_instance_details /usr/lib/python3.9/site-packages/nova/virt/libvirt/driver.py:946#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.299 187132 INFO nova.compute.manager [None req-f5fe5001-067a-46a4-bb88-9b9eca620cbb - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] During sync_power_state the instance has a pending task (spawning). Skip.#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.337 187132 INFO nova.compute.manager [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Took 14.59 seconds to spawn the instance on the hypervisor.#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.338 187132 DEBUG nova.compute.manager [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.404 187132 INFO nova.compute.manager [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Took 15.26 seconds to build instance.#033[00m
Dec 11 01:22:57 np0005554845 nova_compute[187128]: 2025-12-11 06:22:57.425 187132 DEBUG oslo_concurrency.lockutils [None req-4785dce1-6283-4d64-8fc1-ce45b15000ef 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227" "released" by "nova.compute.manager.ComputeManager.build_and_run_instance.<locals>._locked_do_build_and_run_instance" :: held 15.457s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:22:58 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:22:58.159 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '21'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:22:59 np0005554845 nova_compute[187128]: 2025-12-11 06:22:59.370 187132 DEBUG nova.compute.manager [req-468e0808-6d60-4b5f-9776-e260bb74f340 req-6eacfed3-846d-4a41-972a-8935a1b41a5c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Received event network-vif-plugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:22:59 np0005554845 nova_compute[187128]: 2025-12-11 06:22:59.371 187132 DEBUG oslo_concurrency.lockutils [req-468e0808-6d60-4b5f-9776-e260bb74f340 req-6eacfed3-846d-4a41-972a-8935a1b41a5c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:22:59 np0005554845 nova_compute[187128]: 2025-12-11 06:22:59.371 187132 DEBUG oslo_concurrency.lockutils [req-468e0808-6d60-4b5f-9776-e260bb74f340 req-6eacfed3-846d-4a41-972a-8935a1b41a5c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:22:59 np0005554845 nova_compute[187128]: 2025-12-11 06:22:59.372 187132 DEBUG oslo_concurrency.lockutils [req-468e0808-6d60-4b5f-9776-e260bb74f340 req-6eacfed3-846d-4a41-972a-8935a1b41a5c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:22:59 np0005554845 nova_compute[187128]: 2025-12-11 06:22:59.372 187132 DEBUG nova.compute.manager [req-468e0808-6d60-4b5f-9776-e260bb74f340 req-6eacfed3-846d-4a41-972a-8935a1b41a5c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] No waiting events found dispatching network-vif-plugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320
Dec 11 01:22:59 np0005554845 nova_compute[187128]: 2025-12-11 06:22:59.373 187132 WARNING nova.compute.manager [req-468e0808-6d60-4b5f-9776-e260bb74f340 req-6eacfed3-846d-4a41-972a-8935a1b41a5c eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Received unexpected event network-vif-plugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 for instance with vm_state active and task_state None.
Dec 11 01:23:00 np0005554845 nova_compute[187128]: 2025-12-11 06:23:00.163 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:01 np0005554845 nova_compute[187128]: 2025-12-11 06:23:01.233 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:02 np0005554845 nova_compute[187128]: 2025-12-11 06:23:02.054 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:02 np0005554845 NetworkManager[55529]: <info>  [1765434182.0554] manager: (patch-br-int-to-provnet-6001c188-4569-47cd-9788-b0996338163f): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/173)
Dec 11 01:23:02 np0005554845 NetworkManager[55529]: <info>  [1765434182.0564] manager: (patch-provnet-6001c188-4569-47cd-9788-b0996338163f-to-br-int): new Open vSwitch Port device (/org/freedesktop/NetworkManager/Devices/174)
Dec 11 01:23:02 np0005554845 nova_compute[187128]: 2025-12-11 06:23:02.148 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:02 np0005554845 ovn_controller[95428]: 2025-12-11T06:23:02Z|00331|binding|INFO|Releasing lport 758d7f57-8e11-417a-afa2-2250b564315d from this chassis (sb_readonly=0)
Dec 11 01:23:02 np0005554845 nova_compute[187128]: 2025-12-11 06:23:02.162 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:02 np0005554845 nova_compute[187128]: 2025-12-11 06:23:02.564 187132 DEBUG nova.compute.manager [req-902ab3fa-317f-453f-9071-b37a181c435a req-9bf87d60-9275-4570-b5a3-18280ad5b44d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Received event network-changed-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048
Dec 11 01:23:02 np0005554845 nova_compute[187128]: 2025-12-11 06:23:02.565 187132 DEBUG nova.compute.manager [req-902ab3fa-317f-453f-9071-b37a181c435a req-9bf87d60-9275-4570-b5a3-18280ad5b44d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Refreshing instance network info cache due to event network-changed-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053
Dec 11 01:23:02 np0005554845 nova_compute[187128]: 2025-12-11 06:23:02.566 187132 DEBUG oslo_concurrency.lockutils [req-902ab3fa-317f-453f-9071-b37a181c435a req-9bf87d60-9275-4570-b5a3-18280ad5b44d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312
Dec 11 01:23:02 np0005554845 nova_compute[187128]: 2025-12-11 06:23:02.566 187132 DEBUG oslo_concurrency.lockutils [req-902ab3fa-317f-453f-9071-b37a181c435a req-9bf87d60-9275-4570-b5a3-18280ad5b44d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315
Dec 11 01:23:02 np0005554845 nova_compute[187128]: 2025-12-11 06:23:02.567 187132 DEBUG nova.network.neutron [req-902ab3fa-317f-453f-9071-b37a181c435a req-9bf87d60-9275-4570-b5a3-18280ad5b44d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Refreshing network info cache for port 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007
Dec 11 01:23:03 np0005554845 nova_compute[187128]: 2025-12-11 06:23:03.894 187132 DEBUG nova.network.neutron [req-902ab3fa-317f-453f-9071-b37a181c435a req-9bf87d60-9275-4570-b5a3-18280ad5b44d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Updated VIF entry in instance network info cache for port 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482
Dec 11 01:23:03 np0005554845 nova_compute[187128]: 2025-12-11 06:23:03.895 187132 DEBUG nova.network.neutron [req-902ab3fa-317f-453f-9071-b37a181c435a req-9bf87d60-9275-4570-b5a3-18280ad5b44d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Updating instance_info_cache with network_info: [{"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116
Dec 11 01:23:03 np0005554845 nova_compute[187128]: 2025-12-11 06:23:03.912 187132 DEBUG oslo_concurrency.lockutils [req-902ab3fa-317f-453f-9071-b37a181c435a req-9bf87d60-9275-4570-b5a3-18280ad5b44d eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333
Dec 11 01:23:04 np0005554845 nova_compute[187128]: 2025-12-11 06:23:04.015 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:04 np0005554845 podman[226254]: 2025-12-11 06:23:04.11114645 +0000 UTC m=+0.045720083 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:23:05 np0005554845 nova_compute[187128]: 2025-12-11 06:23:05.166 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:06 np0005554845 nova_compute[187128]: 2025-12-11 06:23:06.235 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:23:09Z|00048|pinctrl(ovn_pinctrl0)|INFO|DHCPOFFER fa:16:3e:e3:89:e4 10.100.0.3
Dec 11 01:23:09 np0005554845 ovn_controller[95428]: 2025-12-11T06:23:09Z|00049|pinctrl(ovn_pinctrl0)|INFO|DHCPACK fa:16:3e:e3:89:e4 10.100.0.3
Dec 11 01:23:10 np0005554845 podman[226298]: 2025-12-11 06:23:10.134270632 +0000 UTC m=+0.070766073 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:23:10 np0005554845 nova_compute[187128]: 2025-12-11 06:23:10.171 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:11 np0005554845 nova_compute[187128]: 2025-12-11 06:23:11.237 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:12 np0005554845 podman[226318]: 2025-12-11 06:23:12.131327699 +0000 UTC m=+0.058590922 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:23:12 np0005554845 podman[226319]: 2025-12-11 06:23:12.182240921 +0000 UTC m=+0.097686993 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_managed=true, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:23:15 np0005554845 nova_compute[187128]: 2025-12-11 06:23:15.193 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:16 np0005554845 podman[226363]: 2025-12-11 06:23:16.144083222 +0000 UTC m=+0.073642771 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:23:16 np0005554845 nova_compute[187128]: 2025-12-11 06:23:16.286 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:20 np0005554845 nova_compute[187128]: 2025-12-11 06:23:20.196 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:21 np0005554845 podman[226383]: 2025-12-11 06:23:21.136107404 +0000 UTC m=+0.063569437 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:23:21 np0005554845 podman[226384]: 2025-12-11 06:23:21.140540234 +0000 UTC m=+0.069254741 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, maintainer=Red Hat, Inc., config_id=edpm, container_name=openstack_network_exporter, architecture=x86_64, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9, version=9.6, vcs-type=git, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, distribution-scope=public, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 01:23:21 np0005554845 nova_compute[187128]: 2025-12-11 06:23:21.289 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:25 np0005554845 nova_compute[187128]: 2025-12-11 06:23:25.200 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:26.235 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:23:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:26.236 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:23:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:26.236 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:23:26 np0005554845 nova_compute[187128]: 2025-12-11 06:23:26.292 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.103 12 DEBUG ceilometer.compute.discovery [-] instance data: {'id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'os_type': 'hvm', 'architecture': 'x86_64', 'OS-EXT-SRV-ATTR:instance_name': 'instance-00000036', 'OS-EXT-SRV-ATTR:host': 'compute-2.ctlplane.example.com', 'OS-EXT-STS:vm_state': 'running', 'tenant_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'user_id': '78548cbaea0e406ebb716882c382c954', 'hostId': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'status': 'active', 'metadata': {}} discover_libvirt_polling /usr/lib/python3.9/site-packages/ceilometer/compute/discovery.py:228
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.103 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.bytes in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.147 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.write.bytes volume: 72945664 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.148 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.write.bytes volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8f49c73-f919-4b7c-8816-f34efbd4be0c', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 72945664, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-vda', 'timestamp': '2025-12-11T06:23:30.103985', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e934f88a-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': '130629947e4f3a2ec5a17d0cb0ebc39fa944f1240fa58da0bbe1b2046b3c0698'}, {'source': 'openstack', 'counter_name': 'disk.device.write.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-sda', 'timestamp': '2025-12-11T06:23:30.103985', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e9350730-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': '1a68580c34382d5d08d6ce313cfb10377c500f5c16d16a25ad64bc75cc189c62'}]}, 'timestamp': '2025-12-11 06:23:30.148652', '_unique_id': 'df7005be8c094c499fbe5add797519e0'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.149 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.150 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.allocation in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.165 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.allocation volume: 30089216 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.166 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.allocation volume: 487424 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b9a5f9bb-040f-4570-a1d8-3ba7095f88fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 30089216, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-vda', 'timestamp': '2025-12-11T06:23:30.150958', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e937bdd6-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.852646735, 'message_signature': '776a292426490c40df08825e74a9d70ce9b4bd72d0c4dd09e791855730320fcc'}, {'source': 'openstack', 'counter_name': 'disk.device.allocation', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 487424, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 
'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-sda', 'timestamp': '2025-12-11T06:23:30.150958', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e937cc68-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.852646735, 'message_signature': '6b2200279f8dda19a772445568e96fa5748e6126ee6e67a8305ec953352286f9'}]}, 'timestamp': '2025-12-11 06:23:30.166814', '_unique_id': '8a20a8a0c98c49cfabbea562ce7679ba'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.167 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.169 12 INFO ceilometer.polling.manager [-] Polling pollster memory.usage in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.193 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/memory.usage volume: 46.88671875 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ac61823e-80c3-4efb-884a-d6c87d0c541d', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'memory.usage', 'counter_type': 'gauge', 'counter_unit': 'MB', 'counter_volume': 46.88671875, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'timestamp': '2025-12-11T06:23:30.169159', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1}, 'message_id': 'e93bf978-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.89520142, 'message_signature': '2169b18db773ca92885b564e37c26ec898d9503eefb3137011d1a92f622c0061'}]}, 'timestamp': '2025-12-11 06:23:30.194321', '_unique_id': '5f0510279b6540d29655d83fd7a98876'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.195 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.196 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.requests in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.196 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.read.requests volume: 1105 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.197 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.read.requests volume: 108 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'a382e494-d2e2-4c6e-89d7-63c43af20acf', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 1105, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-vda', 'timestamp': '2025-12-11T06:23:30.196938', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e93c72f4-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': '528fa8a277c7b5129c3c1a73049f6c222346791946fe1045ae6a2eabc193913c'}, {'source': 'openstack', 'counter_name': 'disk.device.read.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 108, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-sda', 'timestamp': '2025-12-11T06:23:30.196938', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e93c8244-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': 'e5982766181ed940b85bbba4f319a34bd12fc389972b9c4a0337f42347321e62'}]}, 'timestamp': '2025-12-11 06:23:30.197708', '_unique_id': '7b45ed2f7dcd48e9a217a4d087a9c8dc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.198 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.199 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.error in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.202 12 DEBUG ceilometer.compute.virt.libvirt.inspector [-] No delta meter predecessor for 7030a3a1-028c-4af8-a8bf-008a08a52227 / tap7b9f5b7c-eb inspect_vnics /usr/lib/python3.9/site-packages/ceilometer/compute/virt/libvirt/inspector.py:136
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.202 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/network.incoming.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 nova_compute[187128]: 2025-12-11 06:23:30.203 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '8391629e-0052-4400-8f8e-86ccff86ad21', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-00000036-7030a3a1-028c-4af8-a8bf-008a08a52227-tap7b9f5b7c-eb', 'timestamp': '2025-12-11T06:23:30.200063', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'tap7b9f5b7c-eb', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e3:89:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b9f5b7c-eb'}, 'message_id': 'e93d5944-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.901756739, 'message_signature': '488045388c74362bbb920b29db38a311bb2f205aa09e81ca7abaceb092d7b892'}]}, 'timestamp': '2025-12-11 06:23:30.203239', '_unique_id': '9cd426698ce64bb1ae84f152f4179a52'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.204 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.205 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.205 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/network.outgoing.bytes volume: 5882 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4a4aa04f-aad0-49bf-bd53-a9986216a923', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 5882, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-00000036-7030a3a1-028c-4af8-a8bf-008a08a52227-tap7b9f5b7c-eb', 'timestamp': '2025-12-11T06:23:30.205822', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'tap7b9f5b7c-eb', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e3:89:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b9f5b7c-eb'}, 'message_id': 'e93dd2d4-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.901756739, 'message_signature': '86fd982e01717de606e50be1f0731734b1fc92bec8f543b5ed20edc24763c4f8'}]}, 'timestamp': '2025-12-11 06:23:30.206351', '_unique_id': 'f481508ba570403d9d5a66579bd0c699'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 rsyslogd[1002]: imjournal: journal files changed, reloading...  [v8.2510.0-2.el9 try https://www.rsyslog.com/e/0 ]
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.207 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.208 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.latency in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.209 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.write.latency volume: 4813793018 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.209 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.write.latency volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '89b02272-0fc6-498e-a075-431f37465b25', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 4813793018, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-vda', 'timestamp': '2025-12-11T06:23:30.209140', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e93e515a-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': '33bd246e638fcafe95e725b32fce06d7fbea89a65d967d68234e11685b7ccd63'}, {'source': 'openstack', 'counter_name': 'disk.device.write.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-sda', 'timestamp': '2025-12-11T06:23:30.209140', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e93e60d2-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': '696739b992b1c994ba9ce1334cce6b2cf82f6f6d55746c9116755e8c86951384'}]}, 'timestamp': '2025-12-11 06:23:30.209965', '_unique_id': '79328e2fde8040ec8fbe88c6f5cffa93'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.210 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.212 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.212 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/network.incoming.packets volume: 42 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd496ec7c-b77c-433e-b22b-55611825b2fe', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 42, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-00000036-7030a3a1-028c-4af8-a8bf-008a08a52227-tap7b9f5b7c-eb', 'timestamp': '2025-12-11T06:23:30.212646', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'tap7b9f5b7c-eb', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e3:89:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b9f5b7c-eb'}, 'message_id': 'e93ed95e-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.901756739, 'message_signature': 'aa7dd41aa28e43c4bc0a630e0c367778274651be9d74ee0ec73a79f096a3443f'}]}, 'timestamp': '2025-12-11 06:23:30.213081', '_unique_id': '58da79df5a0c4007af5fd285771436cc'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.213 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.215 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.error in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.215 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/network.outgoing.packets.error volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2f4599ff-eb97-4a56-afcb-2f43adae121f', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.error', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-00000036-7030a3a1-028c-4af8-a8bf-008a08a52227-tap7b9f5b7c-eb', 'timestamp': '2025-12-11T06:23:30.215578', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'tap7b9f5b7c-eb', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e3:89:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b9f5b7c-eb'}, 'message_id': 'e93f4b6e-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.901756739, 'message_signature': 'cf5c0847c4ff460e7ba482044d4e7e73b6b45601b28ba2178e30f9cf9851db6d'}]}, 'timestamp': '2025-12-11 06:23:30.216013', '_unique_id': '7b53079968d74714bc0b1e3f8a6c9021'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.216 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.218 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.delta in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.218 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/network.outgoing.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '15aa89fe-7306-42c1-9085-76c5b25a7017', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-00000036-7030a3a1-028c-4af8-a8bf-008a08a52227-tap7b9f5b7c-eb', 'timestamp': '2025-12-11T06:23:30.218563', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'tap7b9f5b7c-eb', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e3:89:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b9f5b7c-eb'}, 'message_id': 'e93fbf86-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.901756739, 'message_signature': '10ea0c9ba4c4890dd899954afb986e96e79818cdce0afbc8660e8169cae363f0'}]}, 'timestamp': '2025-12-11 06:23:30.218959', '_unique_id': '83d7ead489b5451f80ba8aaf788b38e3'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.219 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.latency in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.221 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskLatencyPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.221 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.latency from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649>]
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.221 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.bytes in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.221 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.read.bytes volume: 30525952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.222 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.read.bytes volume: 274750 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '4b819d40-4e6c-457e-bf99-30601d7ade9a', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 30525952, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-vda', 'timestamp': '2025-12-11T06:23:30.221818', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e94043de-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': 'fc9f67605c8f1a6b679840ade3f791a31eaaf31f528320a04a2358f48e5f3f53'}, {'source': 'openstack', 'counter_name': 'disk.device.read.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 274750, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': 
'9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-sda', 'timestamp': '2025-12-11T06:23:30.221818', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e94053c4-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': '314d0a42539ecd32b2a509068bd7fed2c95ed8c83b3c100a6344e9a385ae8b3e'}]}, 'timestamp': '2025-12-11 06:23:30.222730', '_unique_id': 'ce4d15b407684dcaa8ab87b029000825'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.223 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.224 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.capacity in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.224 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.capacity volume: 1073741824 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.225 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.capacity volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'b79a6d45-d4cb-4f3f-9be8-f07a022e8d74', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 1073741824, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-vda', 'timestamp': '2025-12-11T06:23:30.224908', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e940b724-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.852646735, 'message_signature': 'f8b0a7dad19025ad9188398c229ddc182706090fcbde0d46ab2eda7952b7df23'}, {'source': 'openstack', 'counter_name': 'disk.device.capacity', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 
'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-sda', 'timestamp': '2025-12-11T06:23:30.224908', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e940c5ca-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.852646735, 'message_signature': '4bec2b77a6f27a25b06ed799d356a139d7172ff9639b0c478ccffbd56a6f7d51'}]}, 'timestamp': '2025-12-11 06:23:30.225648', '_unique_id': '5f61da85595748b2ab77778c0ca3e0fe'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.226 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.227 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.read.latency in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.227 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.read.latency volume: 185483730 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.228 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.read.latency volume: 27210560 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'ef7926ea-ca95-45a8-9d39-ba937bb6ea82', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 185483730, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-vda', 'timestamp': '2025-12-11T06:23:30.227821', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e94128bc-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': '60cb07e7f2d1712580622b4c32266339bdd1986a1b97d195ec6fe42caeeb5221'}, {'source': 'openstack', 'counter_name': 'disk.device.read.latency', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 27210560, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': 
'9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-sda', 'timestamp': '2025-12-11T06:23:30.227821', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e9413aaa-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': '184a49bd2f3f9e7806d55e88730c502b83a5da247014af88845a07571ca15380'}]}, 'timestamp': '2025-12-11 06:23:30.228644', '_unique_id': 'fec98deaac1e48bb9f3f9b772680c756'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.229 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.230 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.usage in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.231 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.usage volume: 29949952 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.231 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.usage volume: 485376 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'e436fc0a-6555-40c7-9152-4c6e4643c769', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 29949952, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-vda', 'timestamp': '2025-12-11T06:23:30.231002', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e941a544-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.852646735, 'message_signature': '74a0062f837ad9211a20fbb0ec8dac0adddf1b32ed2948023417eeddab14fb7d'}, {'source': 'openstack', 'counter_name': 'disk.device.usage', 'counter_type': 'gauge', 'counter_unit': 'B', 'counter_volume': 485376, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 
'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-sda', 'timestamp': '2025-12-11T06:23:30.231002', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e941b3f4-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.852646735, 'message_signature': '1d5c8bd31f4c59bc4a2b2172c4a11df706492ec55c5f95447047409668468e17'}]}, 'timestamp': '2025-12-11 06:23:30.231743', '_unique_id': '08c38e750b9845a4a77dd8cbff41e280'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.232 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.234 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.iops in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.234 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for PerDeviceDiskIOPSPollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.234 12 ERROR ceilometer.polling.manager [-] Prevent pollster disk.device.iops from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649>]
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.234 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.rate in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.234 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for IncomingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.234 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.incoming.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649>]
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.235 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets.drop in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.235 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/network.outgoing.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'cde8fe6a-92f5-40c4-a213-b3d3c4e001db', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-00000036-7030a3a1-028c-4af8-a8bf-008a08a52227-tap7b9f5b7c-eb', 'timestamp': '2025-12-11T06:23:30.235322', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'tap7b9f5b7c-eb', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e3:89:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b9f5b7c-eb'}, 'message_id': 'e9424f08-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.901756739, 'message_signature': 'bc5b1566a13153dfde04456e40d357113362c5186489fba25eff3222f4005afa'}]}, 'timestamp': '2025-12-11 06:23:30.235732', '_unique_id': '729a71a9c12341ed9d4a31e62a75daab'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.236 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.238 12 INFO ceilometer.polling.manager [-] Polling pollster cpu in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.238 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/cpu volume: 11880000000 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': 'd8652e5c-125a-4bf8-bb94-9117ddbb195b', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'cpu', 'counter_type': 'cumulative', 'counter_unit': 'ns', 'counter_volume': 11880000000, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'timestamp': '2025-12-11T06:23:30.238535', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'cpu_number': 1}, 'message_id': 'e942d6e4-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.89520142, 'message_signature': 'd3184bb62a99e85730fecef25420cef467dd3d73a5f909b3ca718881b1f9ee94'}]}, 'timestamp': '2025-12-11 06:23:30.239639', '_unique_id': '190cbf037d854c7191afa329cf2a35a1'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.241 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.242 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.packets in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.243 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/network.outgoing.packets volume: 45 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '1bf0983f-dc8c-4312-9c02-67f4f125ad05', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.outgoing.packets', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 45, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-00000036-7030a3a1-028c-4af8-a8bf-008a08a52227-tap7b9f5b7c-eb', 'timestamp': '2025-12-11T06:23:30.243076', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'tap7b9f5b7c-eb', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e3:89:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b9f5b7c-eb'}, 'message_id': 'e9437f90-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.901756739, 'message_signature': 'b8561ec05f66284e9f7b209058df974450d8c3cf718804871b77124a903c15b9'}]}, 'timestamp': '2025-12-11 06:23:30.243670', '_unique_id': '6b261f233d6c4943af8103912f2aeabf'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.244 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.246 12 INFO ceilometer.polling.manager [-] Polling pollster disk.device.write.requests in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.246 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.write.requests volume: 308 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.247 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/disk.device.write.requests volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '253fed15-105e-4864-8278-fa7816a85a3e', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 308, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-vda', 'timestamp': '2025-12-11T06:23:30.246643', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'vda'}, 'message_id': 'e9440a64-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': '7760309453861ff0a1770905a24ebbdfffcc66229db69febfeb07f32547c5c5f'}, {'source': 'openstack', 'counter_name': 'disk.device.write.requests', 'counter_type': 'cumulative', 'counter_unit': 'request', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': '7030a3a1-028c-4af8-a8bf-008a08a52227-sda', 'timestamp': '2025-12-11T06:23:30.246643', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'instance-00000036', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'disk_name': 'sda'}, 'message_id': 'e9441ea0-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.805668029, 'message_signature': 'a8c08cdea54b8a22b9bb4605fab08d504c1a36b00ce7b02b0381ffaa4c36622d'}]}, 'timestamp': '2025-12-11 06:23:30.247724', '_unique_id': 'fc75160bf7dc450e9bc69788dea27276'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.248 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.250 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.packets.drop in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.250 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/network.incoming.packets.drop volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '5b170832-eca6-457d-b5ae-d9f85c8b62f7', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.packets.drop', 'counter_type': 'cumulative', 'counter_unit': 'packet', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-00000036-7030a3a1-028c-4af8-a8bf-008a08a52227-tap7b9f5b7c-eb', 'timestamp': '2025-12-11T06:23:30.250885', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'tap7b9f5b7c-eb', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e3:89:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b9f5b7c-eb'}, 'message_id': 'e944b022-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.901756739, 'message_signature': '80b36ea33c042379e9ccedcfab373b35fd8207010852690b4d44f6f45a32f14f'}]}, 'timestamp': '2025-12-11 06:23:30.251465', '_unique_id': '4f54ed8b41024f69a01efcbba8d269f8'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.252 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.254 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.254 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/network.incoming.bytes volume: 7284 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '147049df-a8c2-46d3-a506-e15b29d1a006', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes', 'counter_type': 'cumulative', 'counter_unit': 'B', 'counter_volume': 7284, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-00000036-7030a3a1-028c-4af8-a8bf-008a08a52227-tap7b9f5b7c-eb', 'timestamp': '2025-12-11T06:23:30.254751', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'tap7b9f5b7c-eb', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e3:89:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b9f5b7c-eb'}, 'message_id': 'e9454744-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.901756739, 'message_signature': '627d223b7d2a08561e9e9d196ac2ca1fb9989b155793b36cb7641a6fe9316088'}]}, 'timestamp': '2025-12-11 06:23:30.255312', '_unique_id': 'aea83d484b8a40a185dfc47fa98a1d54'}: kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.256 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.258 12 INFO ceilometer.polling.manager [-] Polling pollster network.incoming.bytes.delta in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.259 12 DEBUG ceilometer.compute.pollsters [-] 7030a3a1-028c-4af8-a8bf-008a08a52227/network.incoming.bytes.delta volume: 0 _stats_to_sample /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:108
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging [-] Could not send notification to notifications. Payload={'message_id': '2822b058-1061-40ea-8b45-f211253b1903', 'publisher_id': 'ceilometer.polling', 'event_type': 'telemetry.polling', 'priority': 'SAMPLE', 'payload': {'samples': [{'source': 'openstack', 'counter_name': 'network.incoming.bytes.delta', 'counter_type': 'delta', 'counter_unit': 'B', 'counter_volume': 0, 'user_id': '78548cbaea0e406ebb716882c382c954', 'user_name': None, 'project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'project_name': None, 'resource_id': 'instance-00000036-7030a3a1-028c-4af8-a8bf-008a08a52227-tap7b9f5b7c-eb', 'timestamp': '2025-12-11T06:23:30.259710', 'resource_metadata': {'display_name': 'tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649', 'name': 'tap7b9f5b7c-eb', 'instance_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'instance_type': 'm1.nano', 'host': '5a9a6b0a395e79f3cf743400fa3b1d5d5114af7b64e592ad83c23ca4', 'instance_host': 'compute-2.ctlplane.example.com', 'flavor': {'id': '604ddafe-0c56-4202-93c6-01236db9ae98', 'name': 'm1.nano', 'vcpus': 1, 'ram': 128, 'disk': 1, 'ephemeral': 0, 'swap': 0}, 'status': 'active', 'state': 'running', 'task_state': '', 'image': {'id': '8999c077-a9de-4930-873b-81a3bd2d6c5f'}, 'image_ref': '8999c077-a9de-4930-873b-81a3bd2d6c5f', 'image_ref_url': None, 'architecture': 'x86_64', 'os_type': 'hvm', 'vcpus': 1, 'memory_mb': 128, 'disk_gb': 1, 'ephemeral_gb': 0, 'root_gb': 1, 'mac': 'fa:16:3e:e3:89:e4', 'fref': None, 'parameters': {'interfaceid': None, 'bridge': None}, 'vnic_name': 'tap7b9f5b7c-eb'}, 'message_id': 'e9460e04-d659-11f0-a1f6-fa163e957a1e', 'monotonic_time': 4484.901756739, 'message_signature': '0dc209f842e152a738ecfb44b05acbce7158989d8ccf84f5c6ce2bb121d8584c'}]}, 'timestamp': '2025-12-11 06:23:30.260545', '_unique_id': '3cfae641b294415f8f006f34a06896e6'}: 
kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 446, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     yield
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/utils/functional.py", line 312, in retry_over_time
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     return fun(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 877, in _connection_factory
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     self._connection = self._establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 812, in _establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     conn = self.transport.establish_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/transport/pyamqp.py", line 201, in establish_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     conn.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/connection.py", line 323, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     self.transport.connect()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 129, in connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     self._connect(self.host, self.port, self.connect_timeout)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/amqp/transport.py", line 184, in _connect
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     self.sock.connect(sa)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging ConnectionRefusedError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging The above exception was the direct cause of the following exception:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging Traceback (most recent call last):
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/notify/messaging.py", line 78, in notify
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     self.transport._send_notification(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/transport.py", line 134, in _send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     self._driver.send_notification(target, ctxt, message, version,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 694, in send_notification
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     return self._send(target, ctxt, message,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 653, in _send
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     with self._get_connection(rpc_common.PURPOSE_SEND, retry) as conn:
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/amqpdriver.py", line 605, in _get_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     return rpc_common.ConnectionContext(self._connection_pool,
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/common.py", line 423, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     self.connection = connection_pool.get(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 98, in get
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     return self.create(retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/pool.py", line 135, in create
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     return self.connection_cls(self.conf, self.url, purpose, retry=retry)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 826, in __init__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     self.ensure_connection()
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/oslo_messaging/_drivers/impl_rabbit.py", line 957, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     self.connection.ensure_connection(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 381, in ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     self._ensure_connection(*args, **kwargs)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 433, in _ensure_connection
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     return retry_over_time(
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib64/python3.9/contextlib.py", line 137, in __exit__
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     self.gen.throw(typ, value, traceback)
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging   File "/usr/lib/python3.9/site-packages/kombu/connection.py", line 450, in _reraise_as_library_errors
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging     raise ConnectionError(str(exc)) from exc
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging kombu.exceptions.OperationalError: [Errno 111] Connection refused
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.262 12 ERROR oslo_messaging.notify.messaging 
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.263 12 INFO ceilometer.polling.manager [-] Polling pollster network.outgoing.bytes.rate in the context of pollsters
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.264 12 DEBUG ceilometer.compute.pollsters [-] LibvirtInspector does not provide data for OutgoingBytesRatePollster get_samples /usr/lib/python3.9/site-packages/ceilometer/compute/pollsters/__init__.py:163
Dec 11 01:23:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:23:30.264 12 ERROR ceilometer.polling.manager [-] Prevent pollster network.outgoing.bytes.rate from polling [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649>] on source pollsters anymore!: ceilometer.polling.plugin_base.PollsterPermanentError: [<NovaLikeServer: tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649>]
Dec 11 01:23:31 np0005554845 nova_compute[187128]: 2025-12-11 06:23:31.294 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:35 np0005554845 podman[226428]: 2025-12-11 06:23:35.128919882 +0000 UTC m=+0.064314938 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:23:35 np0005554845 nova_compute[187128]: 2025-12-11 06:23:35.205 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:36 np0005554845 nova_compute[187128]: 2025-12-11 06:23:36.297 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:36 np0005554845 nova_compute[187128]: 2025-12-11 06:23:36.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:38 np0005554845 nova_compute[187128]: 2025-12-11 06:23:38.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:38 np0005554845 nova_compute[187128]: 2025-12-11 06:23:38.716 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:23:38 np0005554845 nova_compute[187128]: 2025-12-11 06:23:38.716 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:23:38 np0005554845 nova_compute[187128]: 2025-12-11 06:23:38.717 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:23:38 np0005554845 nova_compute[187128]: 2025-12-11 06:23:38.717 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:23:38 np0005554845 nova_compute[187128]: 2025-12-11 06:23:38.806 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:23:38 np0005554845 nova_compute[187128]: 2025-12-11 06:23:38.899 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk --force-share --output=json" returned: 0 in 0.092s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:23:38 np0005554845 nova_compute[187128]: 2025-12-11 06:23:38.900 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running cmd (subprocess): /usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk --force-share --output=json execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:23:38 np0005554845 nova_compute[187128]: 2025-12-11 06:23:38.958 187132 DEBUG oslo_concurrency.processutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CMD "/usr/bin/python3 -m oslo_concurrency.prlimit --as=1073741824 --cpu=30 -- env LC_ALL=C LANG=C qemu-img info /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227/disk --force-share --output=json" returned: 0 in 0.059s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:23:39 np0005554845 nova_compute[187128]: 2025-12-11 06:23:39.142 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:23:39 np0005554845 nova_compute[187128]: 2025-12-11 06:23:39.144 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5546MB free_disk=73.25526809692383GB free_vcpus=7 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:23:39 np0005554845 nova_compute[187128]: 2025-12-11 06:23:39.144 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:23:39 np0005554845 nova_compute[187128]: 2025-12-11 06:23:39.144 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:23:39 np0005554845 nova_compute[187128]: 2025-12-11 06:23:39.234 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Instance 7030a3a1-028c-4af8-a8bf-008a08a52227 actively managed on this compute host and has allocations in placement: {'resources': {'DISK_GB': 1, 'MEMORY_MB': 128, 'VCPU': 1}}. _remove_deleted_instances_allocations /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1635#033[00m
Dec 11 01:23:39 np0005554845 nova_compute[187128]: 2025-12-11 06:23:39.234 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 1 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:23:39 np0005554845 nova_compute[187128]: 2025-12-11 06:23:39.234 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=640MB phys_disk=79GB used_disk=1GB total_vcpus=8 used_vcpus=1 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:23:39 np0005554845 nova_compute[187128]: 2025-12-11 06:23:39.325 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:23:39 np0005554845 nova_compute[187128]: 2025-12-11 06:23:39.343 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:23:39 np0005554845 nova_compute[187128]: 2025-12-11 06:23:39.373 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:23:39 np0005554845 nova_compute[187128]: 2025-12-11 06:23:39.373 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.229s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:23:40 np0005554845 nova_compute[187128]: 2025-12-11 06:23:40.209 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:40 np0005554845 nova_compute[187128]: 2025-12-11 06:23:40.368 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:40 np0005554845 nova_compute[187128]: 2025-12-11 06:23:40.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:41 np0005554845 podman[226458]: 2025-12-11 06:23:41.147616733 +0000 UTC m=+0.076697894 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_id=edpm, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ceilometer_agent_compute, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202)
Dec 11 01:23:41 np0005554845 nova_compute[187128]: 2025-12-11 06:23:41.298 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:41 np0005554845 nova_compute[187128]: 2025-12-11 06:23:41.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:41 np0005554845 nova_compute[187128]: 2025-12-11 06:23:41.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:23:41 np0005554845 nova_compute[187128]: 2025-12-11 06:23:41.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:23:41 np0005554845 nova_compute[187128]: 2025-12-11 06:23:41.910 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:23:41 np0005554845 nova_compute[187128]: 2025-12-11 06:23:41.911 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquired lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:23:41 np0005554845 nova_compute[187128]: 2025-12-11 06:23:41.911 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Forcefully refreshing network info cache for instance _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2004#033[00m
Dec 11 01:23:41 np0005554845 nova_compute[187128]: 2025-12-11 06:23:41.912 187132 DEBUG nova.objects.instance [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lazy-loading 'info_cache' on Instance uuid 7030a3a1-028c-4af8-a8bf-008a08a52227 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:23:43 np0005554845 podman[226478]: 2025-12-11 06:23:43.140782104 +0000 UTC m=+0.060956955 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:23:43 np0005554845 podman[226479]: 2025-12-11 06:23:43.172503236 +0000 UTC m=+0.094018054 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2, config_id=ovn_controller)
Dec 11 01:23:43 np0005554845 nova_compute[187128]: 2025-12-11 06:23:43.265 187132 DEBUG nova.network.neutron [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Updating instance_info_cache with network_info: [{"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:23:43 np0005554845 nova_compute[187128]: 2025-12-11 06:23:43.293 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Releasing lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:23:43 np0005554845 nova_compute[187128]: 2025-12-11 06:23:43.293 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Updated the network info_cache for instance _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9929#033[00m
Dec 11 01:23:43 np0005554845 nova_compute[187128]: 2025-12-11 06:23:43.294 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:43 np0005554845 nova_compute[187128]: 2025-12-11 06:23:43.294 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:43 np0005554845 nova_compute[187128]: 2025-12-11 06:23:43.294 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:43 np0005554845 nova_compute[187128]: 2025-12-11 06:23:43.295 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:23:43 np0005554845 nova_compute[187128]: 2025-12-11 06:23:43.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:45 np0005554845 nova_compute[187128]: 2025-12-11 06:23:45.214 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:45 np0005554845 nova_compute[187128]: 2025-12-11 06:23:45.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:45 np0005554845 nova_compute[187128]: 2025-12-11 06:23:45.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec 11 01:23:45 np0005554845 nova_compute[187128]: 2025-12-11 06:23:45.711 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec 11 01:23:46 np0005554845 nova_compute[187128]: 2025-12-11 06:23:46.302 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:47 np0005554845 podman[226519]: 2025-12-11 06:23:47.158201483 +0000 UTC m=+0.079030228 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_id=multipathd, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0)
Dec 11 01:23:50 np0005554845 nova_compute[187128]: 2025-12-11 06:23:50.218 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.305 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:51 np0005554845 podman[226539]: 2025-12-11 06:23:51.420787227 +0000 UTC m=+0.069276823 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:23:51 np0005554845 podman[226540]: 2025-12-11 06:23:51.421352572 +0000 UTC m=+0.067356970 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., distribution-scope=public, build-date=2025-08-20T13:12:41, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, vendor=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_id=edpm, io.openshift.expose-services=, name=ubi9-minimal, maintainer=Red Hat, Inc., io.buildah.version=1.33.7, architecture=x86_64, com.redhat.component=ubi9-minimal-container, io.openshift.tags=minimal rhel9, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.757 187132 DEBUG nova.compute.manager [req-60d6893a-edcf-4508-8ee3-5f8aa0834aa1 req-631e75d8-ed4b-4e64-a939-307ef0ccd4a9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Received event network-changed-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.758 187132 DEBUG nova.compute.manager [req-60d6893a-edcf-4508-8ee3-5f8aa0834aa1 req-631e75d8-ed4b-4e64-a939-307ef0ccd4a9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Refreshing instance network info cache due to event network-changed-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6. external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11053#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.758 187132 DEBUG oslo_concurrency.lockutils [req-60d6893a-edcf-4508-8ee3-5f8aa0834aa1 req-631e75d8-ed4b-4e64-a939-307ef0ccd4a9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:312#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.758 187132 DEBUG oslo_concurrency.lockutils [req-60d6893a-edcf-4508-8ee3-5f8aa0834aa1 req-631e75d8-ed4b-4e64-a939-307ef0ccd4a9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquired lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:315#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.758 187132 DEBUG nova.network.neutron [req-60d6893a-edcf-4508-8ee3-5f8aa0834aa1 req-631e75d8-ed4b-4e64-a939-307ef0ccd4a9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Refreshing network info cache for port 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 _get_instance_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:2007#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.968 187132 DEBUG oslo_concurrency.lockutils [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "7030a3a1-028c-4af8-a8bf-008a08a52227" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.969 187132 DEBUG oslo_concurrency.lockutils [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227" acquired by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.969 187132 DEBUG oslo_concurrency.lockutils [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.970 187132 DEBUG oslo_concurrency.lockutils [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" acquired by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.970 187132 DEBUG oslo_concurrency.lockutils [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" "released" by "nova.compute.manager.InstanceEvents.clear_events_for_instance.<locals>._clear_events" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.972 187132 INFO nova.compute.manager [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Terminating instance#033[00m
Dec 11 01:23:51 np0005554845 nova_compute[187128]: 2025-12-11 06:23:51.973 187132 DEBUG nova.compute.manager [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Start destroying the instance on the hypervisor. _shutdown_instance /usr/lib/python3.9/site-packages/nova/compute/manager.py:3120#033[00m
Dec 11 01:23:51 np0005554845 kernel: tap7b9f5b7c-eb (unregistering): left promiscuous mode
Dec 11 01:23:52 np0005554845 NetworkManager[55529]: <info>  [1765434232.0031] device (tap7b9f5b7c-eb): state change: disconnected -> unmanaged (reason 'unmanaged', managed-type: 'removed')
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.015 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:23:52Z|00332|binding|INFO|Releasing lport 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 from this chassis (sb_readonly=0)
Dec 11 01:23:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:23:52Z|00333|binding|INFO|Setting lport 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 down in Southbound
Dec 11 01:23:52 np0005554845 ovn_controller[95428]: 2025-12-11T06:23:52Z|00334|binding|INFO|Removing iface tap7b9f5b7c-eb ovn-installed in OVS
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.019 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.027 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: PortBindingUpdatedEvent(events=('update',), table='Port_Binding', conditions=None, old_conditions=None), priority=20 to row=Port_Binding(mac=['fa:16:3e:e3:89:e4 10.100.0.3'], port_security=['fa:16:3e:e3:89:e4 10.100.0.3'], type=, nat_addresses=[], virtual_parent=[], up=[False], options={'requested-chassis': 'compute-2.ctlplane.example.com'}, parent_port=[], requested_additional_chassis=[], ha_chassis_group=[], external_ids={'neutron:cidrs': '10.100.0.3/28', 'neutron:device_id': '7030a3a1-028c-4af8-a8bf-008a08a52227', 'neutron:device_owner': 'compute:nova', 'neutron:mtu': '', 'neutron:network_name': 'neutron-30a3c425-ff4d-4be0-b139-7bb4a9781599', 'neutron:port_capabilities': '', 'neutron:port_name': '', 'neutron:project_id': '9d8630abd3cd4aef89d0b1af6e62ac93', 'neutron:revision_number': '4', 'neutron:security_group_ids': '40d159e8-a87c-4e2a-ad3b-023cf01f5577 c95e8f7e-138d-43e5-9624-cbbdd47e0fbb', 'neutron:subnet_pool_addr_scope4': '', 'neutron:subnet_pool_addr_scope6': '', 'neutron:vnic_type': 'normal', 'neutron:host_id': 'compute-2.ctlplane.example.com'}, additional_chassis=[], tag=[], additional_encap=[], encap=[], mirror_rules=[], datapath=0c3b1014-bbb0-4802-b8e5-65838069075c, chassis=[], tunnel_key=3, gateway_chassis=[], requested_chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>], logical_port=7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6) old=Port_Binding(up=[True], chassis=[<ovs.db.idl.Row object at 0x7f1b90327d90>]) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.028 104320 INFO neutron.agent.ovn.metadata.agent [-] Port 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 in datapath 30a3c425-ff4d-4be0-b139-7bb4a9781599 unbound from our chassis#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.030 104320 DEBUG neutron.agent.ovn.metadata.agent [-] No valid VIF ports were found for network 30a3c425-ff4d-4be0-b139-7bb4a9781599, tearing the namespace down if needed _get_provision_params /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:628#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.033 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c5ff6b66-5674-43bf-b256-b54493fc5fbd]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.035 104320 INFO neutron.agent.ovn.metadata.agent [-] Cleaning up ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599 namespace which is not needed anymore#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.041 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:52 np0005554845 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000036.scope: Deactivated successfully.
Dec 11 01:23:52 np0005554845 systemd[1]: machine-qemu\x2d26\x2dinstance\x2d00000036.scope: Consumed 15.394s CPU time.
Dec 11 01:23:52 np0005554845 systemd-machined[153381]: Machine qemu-26-instance-00000036 terminated.
Dec 11 01:23:52 np0005554845 neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599[226237]: [NOTICE]   (226242) : haproxy version is 2.8.14-c23fe91
Dec 11 01:23:52 np0005554845 neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599[226237]: [NOTICE]   (226242) : path to executable is /usr/sbin/haproxy
Dec 11 01:23:52 np0005554845 neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599[226237]: [WARNING]  (226242) : Exiting Master process...
Dec 11 01:23:52 np0005554845 neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599[226237]: [ALERT]    (226242) : Current worker (226244) exited with code 143 (Terminated)
Dec 11 01:23:52 np0005554845 neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599[226237]: [WARNING]  (226242) : All workers exited. Exiting... (0)
Dec 11 01:23:52 np0005554845 systemd[1]: libpod-9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2.scope: Deactivated successfully.
Dec 11 01:23:52 np0005554845 conmon[226237]: conmon 9d16f42fee1570c96cbb <nwarn>: Failed to open cgroups file: /sys/fs/cgroup/machine.slice/libpod-9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2.scope/container/memory.events
Dec 11 01:23:52 np0005554845 podman[226609]: 2025-12-11 06:23:52.202804372 +0000 UTC m=+0.054708917 container died 9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 01:23:52 np0005554845 systemd[1]: var-lib-containers-storage-overlay\x2dcontainers-9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2-userdata-shm.mount: Deactivated successfully.
Dec 11 01:23:52 np0005554845 systemd[1]: var-lib-containers-storage-overlay-a51d8242b72d632e04f575323e57ce1d8a30f410da7f879f60a62e5e6571f8f8-merged.mount: Deactivated successfully.
Dec 11 01:23:52 np0005554845 podman[226609]: 2025-12-11 06:23:52.248146123 +0000 UTC m=+0.100050678 container cleanup 9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.264 187132 INFO nova.virt.libvirt.driver [-] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Instance destroyed successfully.#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.265 187132 DEBUG nova.objects.instance [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lazy-loading 'resources' on Instance uuid 7030a3a1-028c-4af8-a8bf-008a08a52227 obj_load_attr /usr/lib/python3.9/site-packages/nova/objects/instance.py:1105#033[00m
Dec 11 01:23:52 np0005554845 systemd[1]: libpod-conmon-9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2.scope: Deactivated successfully.
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.278 187132 DEBUG nova.virt.libvirt.vif [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] vif_type=ovs instance=Instance(access_ip_v4=None,access_ip_v6=None,architecture=None,auto_disk_config=False,availability_zone='nova',cell_name=None,cleaned=False,config_drive='True',created_at=2025-12-11T06:22:41Z,default_ephemeral_device=None,default_swap_device=None,deleted=False,deleted_at=None,device_metadata=<?>,disable_terminate=False,display_description='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649',display_name='tempest-server-tempest-TestSecurityGroupsBasicOps-2036320412-access_point-288737649',ec2_ids=<?>,ephemeral_gb=0,ephemeral_key_uuid=None,fault=<?>,flavor=Flavor(1),hidden=False,host='compute-2.ctlplane.example.com',hostname='tempest-server-tempest-testsecuritygroupsbasicops-2036320412-ac',id=54,image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',info_cache=InstanceInfoCache,instance_type_id=1,kernel_id='',key_data='ecdsa-sha2-nistp384 
AAAAE2VjZHNhLXNoYTItbmlzdHAzODQAAAAIbmlzdHAzODQAAABhBCcBU2AiCiG43plJXeuCC5swRewcLeFXgxthvl4QIA/JXvysQvegVCEFCqn/SWbo/hhdJCKug/ac2zzwPBAlkyrvklynj1zrxLiffMjoakg13WIhbQT3eDWM+5WvMrCymQ==',key_name='tempest-TestSecurityGroupsBasicOps-1772198906',keypairs=<?>,launch_index=0,launched_at=2025-12-11T06:22:57Z,launched_on='compute-2.ctlplane.example.com',locked=False,locked_by=None,memory_mb=128,metadata={},migration_context=<?>,new_flavor=None,node='compute-2.ctlplane.example.com',numa_topology=None,old_flavor=None,os_type=None,pci_devices=<?>,pci_requests=<?>,power_state=1,progress=0,project_id='9d8630abd3cd4aef89d0b1af6e62ac93',ramdisk_id='',reservation_id='r-e2124y2j',resources=None,root_device_name='/dev/vda',root_gb=1,security_groups=SecurityGroupList,services=<?>,shutdown_terminate=False,system_metadata={boot_roles='reader,member',image_base_image_ref='8999c077-a9de-4930-873b-81a3bd2d6c5f',image_container_format='bare',image_disk_format='qcow2',image_hw_cdrom_bus='sata',image_hw_disk_bus='virtio',image_hw_input_bus='usb',image_hw_machine_type='q35',image_hw_pointer_model='usbtablet',image_hw_rng_model='virtio',image_hw_video_model='virtio',image_hw_vif_model='virtio',image_min_disk='1',image_min_ram='0',owner_project_name='tempest-TestSecurityGroupsBasicOps-2036320412',owner_user_name='tempest-TestSecurityGroupsBasicOps-2036320412-project-member'},tags=<?>,task_state='deleting',terminated_at=None,trusted_certs=<?>,updated_at=2025-12-11T06:22:57Z,user_data=None,user_id='78548cbaea0e406ebb716882c382c954',uuid=7030a3a1-028c-4af8-a8bf-008a08a52227,vcpu_model=<?>,vcpus=1,vm_mode=None,vm_state='active') vif={"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", 
"type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} unplug /usr/lib/python3.9/site-packages/nova/virt/libvirt/vif.py:828#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.279 187132 DEBUG nova.network.os_vif_util [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converting VIF {"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": [{"address": "192.168.122.193", "type": "floating", "version": 4, "meta": {}}]}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}} nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:511#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.280 187132 DEBUG nova.network.os_vif_util [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Converted object VIFOpenVSwitch(active=True,address=fa:16:3e:e3:89:e4,bridge_name='br-int',has_traffic_filtering=True,id=7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6,network=Network(30a3c425-ff4d-4be0-b139-7bb4a9781599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b9f5b7c-eb') nova_to_osvif_vif /usr/lib/python3.9/site-packages/nova/network/os_vif_util.py:548#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.281 187132 DEBUG os_vif [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Unplugging vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:89:e4,bridge_name='br-int',has_traffic_filtering=True,id=7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6,network=Network(30a3c425-ff4d-4be0-b139-7bb4a9781599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b9f5b7c-eb') unplug /usr/lib/python3.9/site-packages/os_vif/__init__.py:109#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.284 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 20 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.284 187132 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap7b9f5b7c-eb, bridge=br-int, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.286 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.288 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] 0-ms timeout __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:248#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.289 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.294 187132 INFO os_vif [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Successfully unplugged vif VIFOpenVSwitch(active=True,address=fa:16:3e:e3:89:e4,bridge_name='br-int',has_traffic_filtering=True,id=7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6,network=Network(30a3c425-ff4d-4be0-b139-7bb4a9781599),plugin='ovs',port_profile=VIFPortProfileOpenVSwitch,preserve_on_delete=False,vif_name='tap7b9f5b7c-eb')#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.295 187132 INFO nova.virt.libvirt.driver [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Deleting instance files /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227_del#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.296 187132 INFO nova.virt.libvirt.driver [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Deletion of /var/lib/nova/instances/7030a3a1-028c-4af8-a8bf-008a08a52227_del complete#033[00m
Dec 11 01:23:52 np0005554845 podman[226654]: 2025-12-11 06:23:52.318324469 +0000 UTC m=+0.046477603 container remove 9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true)
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.324 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[c2445e1c-7a24-42ce-8022-7374833eedc9]: (4, ('Thu Dec 11 06:23:52 AM UTC 2025 Stopping container neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599 (9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2)\n9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2\nThu Dec 11 06:23:52 AM UTC 2025 Deleting container neutron-haproxy-ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599 (9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2)\n9d16f42fee1570c96cbbede6919b5ed12aac82b641363706ea81ed37bce12cf2\n', '', 0)) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.327 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[276839ce-08a5-4beb-b58e-4c46a6894f8f]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.328 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DelPortCommand(_result=None, port=tap30a3c425-f0, bridge=None, if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:23:52 np0005554845 kernel: tap30a3c425-f0: left promiscuous mode
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.330 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.337 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[38ba0d28-93e5-410e-a3aa-930cbf983600]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.343 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.353 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[24ac0a60-2c79-40a1-b9f5-7c5740f5c6ba]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.354 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[357a0c8e-db6a-425e-900e-2ba664ff9249]: (4, True) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.375 213683 DEBUG oslo.privsep.daemon [-] privsep: reply[8031421f-f42f-4020-b3e8-e17b21f7bda9]: (4, [{'family': 0, '__align': (), 'ifi_type': 772, 'index': 1, 'flags': 65609, 'change': 0, 'attrs': [['IFLA_IFNAME', 'lo'], ['IFLA_TXQLEN', 1000], ['IFLA_OPERSTATE', 'UNKNOWN'], ['IFLA_LINKMODE', 0], ['IFLA_MTU', 65536], ['IFLA_MIN_MTU', 0], ['IFLA_MAX_MTU', 0], ['IFLA_GROUP', 0], ['IFLA_PROMISCUITY', 0], ['UNKNOWN', {'header': {'length': 8, 'type': 61}}], ['IFLA_NUM_TX_QUEUES', 1], ['IFLA_GSO_MAX_SEGS', 65535], ['IFLA_GSO_MAX_SIZE', 65536], ['IFLA_GRO_MAX_SIZE', 65536], ['UNKNOWN', {'header': {'length': 8, 'type': 63}}], ['UNKNOWN', {'header': {'length': 8, 'type': 64}}], ['IFLA_TSO_MAX_SIZE', 524280], ['IFLA_TSO_MAX_SEGS', 65535], ['UNKNOWN', {'header': {'length': 8, 'type': 66}}], ['IFLA_NUM_RX_QUEUES', 1], ['IFLA_CARRIER', 1], ['IFLA_CARRIER_CHANGES', 0], ['IFLA_CARRIER_UP_COUNT', 0], ['IFLA_CARRIER_DOWN_COUNT', 0], ['IFLA_PROTO_DOWN', 0], ['IFLA_ADDRESS', '00:00:00:00:00:00'], ['IFLA_BROADCAST', '00:00:00:00:00:00'], ['IFLA_STATS64', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_STATS', {'rx_packets': 1, 'tx_packets': 1, 'rx_bytes': 28, 'tx_bytes': 28, 'rx_errors': 0, 'tx_errors': 0, 'rx_dropped': 0, 'tx_dropped': 0, 'multicast': 0, 'collisions': 0, 'rx_length_errors': 0, 'rx_over_errors': 0, 'rx_crc_errors': 0, 'rx_frame_errors': 0, 'rx_fifo_errors': 0, 'rx_missed_errors': 0, 'tx_aborted_errors': 0, 'tx_carrier_errors': 0, 'tx_fifo_errors': 0, 'tx_heartbeat_errors': 0, 
'tx_window_errors': 0, 'rx_compressed': 0, 'tx_compressed': 0}], ['IFLA_XDP', {'attrs': [['IFLA_XDP_ATTACHED', None]]}], ['IFLA_QDISC', 'noqueue'], ['IFLA_AF_SPEC', {'attrs': [['AF_INET', {'dummy': 65668, 'forwarding': 1, 'mc_forwarding': 0, 'proxy_arp': 0, 'accept_redirects': 0, 'secure_redirects': 0, 'send_redirects': 0, 'shared_media': 1, 'rp_filter': 1, 'accept_source_route': 0, 'bootp_relay': 0, 'log_martians': 0, 'tag': 0, 'arpfilter': 0, 'medium_id': 0, 'noxfrm': 1, 'nopolicy': 1, 'force_igmp_version': 0, 'arp_announce': 0, 'arp_ignore': 0, 'promote_secondaries': 1, 'arp_accept': 0, 'arp_notify': 0, 'accept_local': 0, 'src_vmark': 0, 'proxy_arp_pvlan': 0, 'route_localnet': 0, 'igmpv2_unsolicited_report_interval': 10000, 'igmpv3_unsolicited_report_interval': 1000}], ['AF_INET6', {'attrs': [['IFLA_INET6_FLAGS', 2147483648], ['IFLA_INET6_CACHEINFO', {'max_reasm_len': 65535, 'tstamp': 444626, 'reachable_time': 25091, 'retrans_time': 1000}], ['IFLA_INET6_CONF', {'forwarding': 0, 'hop_limit': 64, 'mtu': 65536, 'accept_ra': 1, 'accept_redirects': 1, 'autoconf': 1, 'dad_transmits': 1, 'router_solicitations': 4294967295, 'router_solicitation_interval': 4000, 'router_solicitation_delay': 1000, 'use_tempaddr': 4294967295, 'temp_valid_lft': 604800, 'temp_preferred_lft': 86400, 'regen_max_retry': 3, 'max_desync_factor': 600, 'max_addresses': 16, 'force_mld_version': 0, 'accept_ra_defrtr': 1, 'accept_ra_pinfo': 1, 'accept_ra_rtr_pref': 1, 'router_probe_interval': 60000, 'accept_ra_rt_info_max_plen': 0, 'proxy_ndp': 0, 'optimistic_dad': 0, 'accept_source_route': 0, 'mc_forwarding': 0, 'disable_ipv6': 0, 'accept_dad': 4294967295, 'force_tllao': 0, 'ndisc_notify': 0}], ['IFLA_INET6_STATS', {'num': 38, 'inpkts': 0, 'inoctets': 0, 'indelivers': 0, 'outforwdatagrams': 0, 'outpkts': 0, 'outoctets': 0, 'inhdrerrors': 0, 'intoobigerrors': 0, 'innoroutes': 0, 'inaddrerrors': 0, 'inunknownprotos': 0, 'intruncatedpkts': 0, 'indiscards': 0, 'outdiscards': 0, 'outnoroutes': 0, 
'reasmtimeout': 0, 'reasmreqds': 0, 'reasmoks': 0, 'reasmfails': 0, 'fragoks': 0, 'fragfails': 0, 'fragcreates': 0, 'inmcastpkts': 0, 'outmcastpkts': 0, 'inbcastpkts': 0, 'outbcastpkts': 0, 'inmcastoctets': 0, 'outmcastoctets': 0, 'inbcastoctets': 0, 'outbcastoctets': 0, 'csumerrors': 0, 'noectpkts': 0, 'ect1pkts': 0, 'ect0pkts': 0, 'cepkts': 0}], ['IFLA_INET6_ICMP6STATS', {'num': 7, 'inmsgs': 0, 'inerrors': 0, 'outmsgs': 0, 'outerrors': 0, 'csumerrors': 0}], ['IFLA_INET6_TOKEN', '::'], ['IFLA_INET6_ADDR_GEN_MODE', 0]]}]]}], ['IFLA_MAP', {'mem_start': 0, 'mem_end': 0, 'base_addr': 0, 'irq': 0, 'dma': 0, 'port': 0}], ['UNKNOWN', {'header': {'length': 4, 'type': 32830}}], ['UNKNOWN', {'header': {'length': 4, 'type': 32833}}]], 'header': {'length': 1404, 'type': 16, 'flags': 2, 'sequence_number': 255, 'pid': 226673, 'error': None, 'target': 'ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599', 'stats': (0, 0, 0)}, 'state': 'up', 'event': 'RTM_NEWLINK'}]) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:23:52 np0005554845 systemd[1]: run-netns-ovnmeta\x2d30a3c425\x2dff4d\x2d4be0\x2db139\x2d7bb4a9781599.mount: Deactivated successfully.
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.380 104433 DEBUG neutron.privileged.agent.linux.ip_lib [-] Namespace ovnmeta-30a3c425-ff4d-4be0-b139-7bb4a9781599 deleted. remove_netns /usr/lib/python3.9/site-packages/neutron/privileged/agent/linux/ip_lib.py:607#033[00m
Dec 11 01:23:52 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:52.380 104433 DEBUG oslo.privsep.daemon [-] privsep: reply[f7ce0951-90ff-4326-81e7-c50cbd5e4f45]: (4, None) _call_back /usr/lib/python3.9/site-packages/oslo_privsep/daemon.py:501#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.551 187132 INFO nova.compute.manager [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Took 0.58 seconds to destroy the instance on the hypervisor.#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.552 187132 DEBUG oslo.service.loopingcall [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Waiting for function nova.compute.manager.ComputeManager._try_deallocate_network.<locals>._deallocate_network_with_retries to return. func /usr/lib/python3.9/site-packages/oslo_service/loopingcall.py:435#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.552 187132 DEBUG nova.compute.manager [-] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Deallocating network for instance _deallocate_network /usr/lib/python3.9/site-packages/nova/compute/manager.py:2259#033[00m
Dec 11 01:23:52 np0005554845 nova_compute[187128]: 2025-12-11 06:23:52.553 187132 DEBUG nova.network.neutron [-] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] deallocate_for_instance() deallocate_for_instance /usr/lib/python3.9/site-packages/nova/network/neutron.py:1803#033[00m
Dec 11 01:23:53 np0005554845 nova_compute[187128]: 2025-12-11 06:23:53.856 187132 DEBUG nova.compute.manager [req-85422f5c-b798-48a0-a525-26732aaba3d8 req-444fa191-4b38-4ea6-94fd-00218d5fd2ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Received event network-vif-unplugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:23:53 np0005554845 nova_compute[187128]: 2025-12-11 06:23:53.856 187132 DEBUG oslo_concurrency.lockutils [req-85422f5c-b798-48a0-a525-26732aaba3d8 req-444fa191-4b38-4ea6-94fd-00218d5fd2ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:23:53 np0005554845 nova_compute[187128]: 2025-12-11 06:23:53.857 187132 DEBUG oslo_concurrency.lockutils [req-85422f5c-b798-48a0-a525-26732aaba3d8 req-444fa191-4b38-4ea6-94fd-00218d5fd2ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:23:53 np0005554845 nova_compute[187128]: 2025-12-11 06:23:53.858 187132 DEBUG oslo_concurrency.lockutils [req-85422f5c-b798-48a0-a525-26732aaba3d8 req-444fa191-4b38-4ea6-94fd-00218d5fd2ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:23:53 np0005554845 nova_compute[187128]: 2025-12-11 06:23:53.858 187132 DEBUG nova.compute.manager [req-85422f5c-b798-48a0-a525-26732aaba3d8 req-444fa191-4b38-4ea6-94fd-00218d5fd2ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] No waiting events found dispatching network-vif-unplugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:23:53 np0005554845 nova_compute[187128]: 2025-12-11 06:23:53.859 187132 DEBUG nova.compute.manager [req-85422f5c-b798-48a0-a525-26732aaba3d8 req-444fa191-4b38-4ea6-94fd-00218d5fd2ff eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Received event network-vif-unplugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 for instance with task_state deleting. _process_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10826#033[00m
Dec 11 01:23:54 np0005554845 nova_compute[187128]: 2025-12-11 06:23:54.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.646 187132 DEBUG nova.network.neutron [-] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.678 187132 INFO nova.compute.manager [-] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Took 3.13 seconds to deallocate network for instance.#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.729 187132 DEBUG oslo_concurrency.lockutils [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.update_usage" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.730 187132 DEBUG oslo_concurrency.lockutils [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.780 187132 DEBUG nova.network.neutron [req-60d6893a-edcf-4508-8ee3-5f8aa0834aa1 req-631e75d8-ed4b-4e64-a939-307ef0ccd4a9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Updated VIF entry in instance network info cache for port 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6. _build_network_info_model /usr/lib/python3.9/site-packages/nova/network/neutron.py:3482#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.781 187132 DEBUG nova.network.neutron [req-60d6893a-edcf-4508-8ee3-5f8aa0834aa1 req-631e75d8-ed4b-4e64-a939-307ef0ccd4a9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Updating instance_info_cache with network_info: [{"id": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "address": "fa:16:3e:e3:89:e4", "network": {"id": "30a3c425-ff4d-4be0-b139-7bb4a9781599", "bridge": "br-int", "label": "tempest-network-smoke--1438918693", "subnets": [{"cidr": "10.100.0.0/28", "dns": [], "gateway": {"address": "10.100.0.1", "type": "gateway", "version": 4, "meta": {}}, "ips": [{"address": "10.100.0.3", "type": "fixed", "version": 4, "meta": {}, "floating_ips": []}], "routes": [], "version": 4, "meta": {"enable_dhcp": true}}], "meta": {"injected": false, "tenant_id": "9d8630abd3cd4aef89d0b1af6e62ac93", "mtu": 1442, "physical_network": null, "tunneled": true}}, "type": "ovs", "details": {"port_filter": true, "connectivity": "l2", "bridge_name": "br-int", "datapath_type": "system", "bound_drivers": {"0": "ovn"}}, "devname": "tap7b9f5b7c-eb", "ovs_interfaceid": "7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6", "qbh_params": null, "qbg_params": null, "active": true, "vnic_type": "normal", "profile": {}, "preserve_on_delete": false, "delegate_create": true, "meta": {}}] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.804 187132 DEBUG oslo_concurrency.lockutils [req-60d6893a-edcf-4508-8ee3-5f8aa0834aa1 req-631e75d8-ed4b-4e64-a939-307ef0ccd4a9 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Releasing lock "refresh_cache-7030a3a1-028c-4af8-a8bf-008a08a52227" lock /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:333#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.815 187132 DEBUG nova.compute.provider_tree [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.833 187132 DEBUG nova.scheduler.client.report [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.853 187132 DEBUG oslo_concurrency.lockutils [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.update_usage" :: held 0.123s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.906 187132 INFO nova.scheduler.client.report [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Deleted allocations for instance 7030a3a1-028c-4af8-a8bf-008a08a52227#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.949 187132 DEBUG nova.compute.manager [req-602d29ca-de7d-4901-88d1-79a17e1c6b50 req-d7d7eaba-faf1-47dc-971b-3577350da517 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Received event network-vif-plugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.949 187132 DEBUG oslo_concurrency.lockutils [req-602d29ca-de7d-4901-88d1-79a17e1c6b50 req-d7d7eaba-faf1-47dc-971b-3577350da517 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Acquiring lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.950 187132 DEBUG oslo_concurrency.lockutils [req-602d29ca-de7d-4901-88d1-79a17e1c6b50 req-d7d7eaba-faf1-47dc-971b-3577350da517 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" acquired by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.950 187132 DEBUG oslo_concurrency.lockutils [req-602d29ca-de7d-4901-88d1-79a17e1c6b50 req-d7d7eaba-faf1-47dc-971b-3577350da517 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227-events" "released" by "nova.compute.manager.InstanceEvents.pop_instance_event.<locals>._pop_event" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.951 187132 DEBUG nova.compute.manager [req-602d29ca-de7d-4901-88d1-79a17e1c6b50 req-d7d7eaba-faf1-47dc-971b-3577350da517 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] No waiting events found dispatching network-vif-plugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 pop_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:320#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.951 187132 WARNING nova.compute.manager [req-602d29ca-de7d-4901-88d1-79a17e1c6b50 req-d7d7eaba-faf1-47dc-971b-3577350da517 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Received unexpected event network-vif-plugged-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 for instance with vm_state deleted and task_state None.#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.951 187132 DEBUG nova.compute.manager [req-602d29ca-de7d-4901-88d1-79a17e1c6b50 req-d7d7eaba-faf1-47dc-971b-3577350da517 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Received event network-vif-deleted-7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6 external_instance_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:11048#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.952 187132 INFO nova.compute.manager [req-602d29ca-de7d-4901-88d1-79a17e1c6b50 req-d7d7eaba-faf1-47dc-971b-3577350da517 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Neutron deleted interface 7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6; detaching it from the instance and deleting it from the info cache#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.952 187132 DEBUG nova.network.neutron [req-602d29ca-de7d-4901-88d1-79a17e1c6b50 req-d7d7eaba-faf1-47dc-971b-3577350da517 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Updating instance_info_cache with network_info: [] update_instance_cache_with_nw_info /usr/lib/python3.9/site-packages/nova/network/neutron.py:116#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.984 187132 DEBUG oslo_concurrency.lockutils [None req-b63944b0-ff1f-484e-9b59-1f94ed6340ba 78548cbaea0e406ebb716882c382c954 9d8630abd3cd4aef89d0b1af6e62ac93 - - default default] Lock "7030a3a1-028c-4af8-a8bf-008a08a52227" "released" by "nova.compute.manager.ComputeManager.terminate_instance.<locals>.do_terminate_instance" :: held 4.015s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:23:55 np0005554845 nova_compute[187128]: 2025-12-11 06:23:55.989 187132 DEBUG nova.compute.manager [req-602d29ca-de7d-4901-88d1-79a17e1c6b50 req-d7d7eaba-faf1-47dc-971b-3577350da517 eb0829b666f44ab1a74bab50d3268066 c262b6ddd8524737a0fb1a6a9ed1740e - - default default] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Detach interface failed, port_id=7b9f5b7c-ebce-4c72-b5ab-13f7d4a281d6, reason: Instance 7030a3a1-028c-4af8-a8bf-008a08a52227 could not be found. _process_instance_vif_deleted_event /usr/lib/python3.9/site-packages/nova/compute/manager.py:10882#033[00m
Dec 11 01:23:56 np0005554845 nova_compute[187128]: 2025-12-11 06:23:56.307 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:57 np0005554845 nova_compute[187128]: 2025-12-11 06:23:57.037 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:57.036 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=22, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=21) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:23:57 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:23:57.039 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 3 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:23:57 np0005554845 nova_compute[187128]: 2025-12-11 06:23:57.286 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:23:58 np0005554845 nova_compute[187128]: 2025-12-11 06:23:58.386 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:23:58 np0005554845 nova_compute[187128]: 2025-12-11 06:23:58.386 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec 11 01:24:00 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:24:00.042 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '22'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:24:00 np0005554845 nova_compute[187128]: 2025-12-11 06:24:00.602 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:00 np0005554845 nova_compute[187128]: 2025-12-11 06:24:00.750 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:01 np0005554845 nova_compute[187128]: 2025-12-11 06:24:01.308 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:02 np0005554845 nova_compute[187128]: 2025-12-11 06:24:02.289 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:06 np0005554845 podman[226675]: 2025-12-11 06:24:06.146044616 +0000 UTC m=+0.068023949 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:24:06 np0005554845 nova_compute[187128]: 2025-12-11 06:24:06.310 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:07 np0005554845 nova_compute[187128]: 2025-12-11 06:24:07.263 187132 DEBUG nova.virt.driver [-] Emitting event <LifecycleEvent: 1765434232.2606773, 7030a3a1-028c-4af8-a8bf-008a08a52227 => Stopped> emit_event /usr/lib/python3.9/site-packages/nova/virt/driver.py:1653#033[00m
Dec 11 01:24:07 np0005554845 nova_compute[187128]: 2025-12-11 06:24:07.263 187132 INFO nova.compute.manager [-] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] VM Stopped (Lifecycle Event)#033[00m
Dec 11 01:24:07 np0005554845 nova_compute[187128]: 2025-12-11 06:24:07.287 187132 DEBUG nova.compute.manager [None req-82e970ca-a8ae-4bae-9190-cd5cf2739188 - - - - - -] [instance: 7030a3a1-028c-4af8-a8bf-008a08a52227] Checking state _get_power_state /usr/lib/python3.9/site-packages/nova/compute/manager.py:1762#033[00m
Dec 11 01:24:07 np0005554845 nova_compute[187128]: 2025-12-11 06:24:07.334 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:11 np0005554845 nova_compute[187128]: 2025-12-11 06:24:11.311 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:12 np0005554845 podman[226699]: 2025-12-11 06:24:12.160976824 +0000 UTC m=+0.084887277 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, managed_by=edpm_ansible)
Dec 11 01:24:12 np0005554845 nova_compute[187128]: 2025-12-11 06:24:12.336 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:14 np0005554845 podman[226720]: 2025-12-11 06:24:14.150588279 +0000 UTC m=+0.073993911 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, container_name=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec 11 01:24:14 np0005554845 podman[226721]: 2025-12-11 06:24:14.159644584 +0000 UTC m=+0.088393042 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller)
Dec 11 01:24:16 np0005554845 nova_compute[187128]: 2025-12-11 06:24:16.313 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:17 np0005554845 nova_compute[187128]: 2025-12-11 06:24:17.338 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:18 np0005554845 podman[226764]: 2025-12-11 06:24:18.131632139 +0000 UTC m=+0.064987535 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd)
Dec 11 01:24:21 np0005554845 nova_compute[187128]: 2025-12-11 06:24:21.315 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:22 np0005554845 podman[226784]: 2025-12-11 06:24:22.128566552 +0000 UTC m=+0.061908021 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:24:22 np0005554845 podman[226785]: 2025-12-11 06:24:22.16529927 +0000 UTC m=+0.092484513 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, managed_by=edpm_ansible, com.redhat.component=ubi9-minimal-container, distribution-scope=public, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.openshift.expose-services=, container_name=openstack_network_exporter, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, build-date=2025-08-20T13:12:41, config_id=edpm, maintainer=Red Hat, Inc.)
Dec 11 01:24:22 np0005554845 nova_compute[187128]: 2025-12-11 06:24:22.340 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:24:26.236 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:24:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:24:26.236 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:24:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:24:26.236 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:24:26 np0005554845 nova_compute[187128]: 2025-12-11 06:24:26.317 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:27 np0005554845 nova_compute[187128]: 2025-12-11 06:24:27.342 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:31 np0005554845 nova_compute[187128]: 2025-12-11 06:24:31.318 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:31 np0005554845 ovn_controller[95428]: 2025-12-11T06:24:31Z|00335|memory_trim|INFO|Detected inactivity (last active 30010 ms ago): trimming memory
Dec 11 01:24:32 np0005554845 nova_compute[187128]: 2025-12-11 06:24:32.344 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:36 np0005554845 nova_compute[187128]: 2025-12-11 06:24:36.321 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:36 np0005554845 nova_compute[187128]: 2025-12-11 06:24:36.707 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:24:37 np0005554845 podman[226828]: 2025-12-11 06:24:37.155040511 +0000 UTC m=+0.083037857 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:24:37 np0005554845 nova_compute[187128]: 2025-12-11 06:24:37.345 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:38 np0005554845 nova_compute[187128]: 2025-12-11 06:24:38.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:24:38 np0005554845 nova_compute[187128]: 2025-12-11 06:24:38.885 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:24:38 np0005554845 nova_compute[187128]: 2025-12-11 06:24:38.886 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:24:38 np0005554845 nova_compute[187128]: 2025-12-11 06:24:38.886 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:24:38 np0005554845 nova_compute[187128]: 2025-12-11 06:24:38.886 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:24:39 np0005554845 nova_compute[187128]: 2025-12-11 06:24:39.105 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:24:39 np0005554845 nova_compute[187128]: 2025-12-11 06:24:39.106 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5746MB free_disk=73.28395462036133GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:24:39 np0005554845 nova_compute[187128]: 2025-12-11 06:24:39.106 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:24:39 np0005554845 nova_compute[187128]: 2025-12-11 06:24:39.107 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:24:39 np0005554845 nova_compute[187128]: 2025-12-11 06:24:39.182 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:24:39 np0005554845 nova_compute[187128]: 2025-12-11 06:24:39.183 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:24:39 np0005554845 nova_compute[187128]: 2025-12-11 06:24:39.240 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:24:39 np0005554845 nova_compute[187128]: 2025-12-11 06:24:39.260 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:24:39 np0005554845 nova_compute[187128]: 2025-12-11 06:24:39.283 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:24:39 np0005554845 nova_compute[187128]: 2025-12-11 06:24:39.284 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.177s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:24:41 np0005554845 nova_compute[187128]: 2025-12-11 06:24:41.322 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:42 np0005554845 nova_compute[187128]: 2025-12-11 06:24:42.280 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:24:42 np0005554845 nova_compute[187128]: 2025-12-11 06:24:42.281 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:24:42 np0005554845 nova_compute[187128]: 2025-12-11 06:24:42.281 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:24:42 np0005554845 nova_compute[187128]: 2025-12-11 06:24:42.281 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:24:42 np0005554845 nova_compute[187128]: 2025-12-11 06:24:42.346 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:42 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:24:42.474 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=23, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=22) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:24:42 np0005554845 nova_compute[187128]: 2025-12-11 06:24:42.474 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:42 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:24:42.475 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 1 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:24:43 np0005554845 podman[226853]: 2025-12-11 06:24:43.145448389 +0000 UTC m=+0.070377655 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 01:24:43 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:24:43.478 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '23'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:24:44 np0005554845 nova_compute[187128]: 2025-12-11 06:24:44.273 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:24:44 np0005554845 nova_compute[187128]: 2025-12-11 06:24:44.273 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:24:44 np0005554845 nova_compute[187128]: 2025-12-11 06:24:44.274 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:24:44 np0005554845 nova_compute[187128]: 2025-12-11 06:24:44.274 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:24:44 np0005554845 nova_compute[187128]: 2025-12-11 06:24:44.275 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:24:44 np0005554845 nova_compute[187128]: 2025-12-11 06:24:44.275 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:24:44 np0005554845 nova_compute[187128]: 2025-12-11 06:24:44.693 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:24:44 np0005554845 nova_compute[187128]: 2025-12-11 06:24:44.707 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:24:45 np0005554845 podman[226873]: 2025-12-11 06:24:45.120007098 +0000 UTC m=+0.051311933 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3)
Dec 11 01:24:45 np0005554845 podman[226874]: 2025-12-11 06:24:45.156728606 +0000 UTC m=+0.086796547 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:24:46 np0005554845 nova_compute[187128]: 2025-12-11 06:24:46.325 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:47 np0005554845 nova_compute[187128]: 2025-12-11 06:24:47.348 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:49 np0005554845 podman[226915]: 2025-12-11 06:24:49.154959264 +0000 UTC m=+0.080390915 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, config_id=multipathd, org.label-schema.vendor=CentOS, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, managed_by=edpm_ansible)
Dec 11 01:24:51 np0005554845 nova_compute[187128]: 2025-12-11 06:24:51.327 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:52 np0005554845 nova_compute[187128]: 2025-12-11 06:24:52.351 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:53 np0005554845 podman[226937]: 2025-12-11 06:24:53.130905081 +0000 UTC m=+0.061165987 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, architecture=x86_64, name=ubi9-minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, vcs-type=git, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vendor=Red Hat, Inc., io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., release=1755695350)
Dec 11 01:24:53 np0005554845 podman[226936]: 2025-12-11 06:24:53.138422393 +0000 UTC m=+0.067567059 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:24:56 np0005554845 nova_compute[187128]: 2025-12-11 06:24:56.329 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:24:57 np0005554845 nova_compute[187128]: 2025-12-11 06:24:57.354 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:01 np0005554845 nova_compute[187128]: 2025-12-11 06:25:01.330 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:02 np0005554845 nova_compute[187128]: 2025-12-11 06:25:02.356 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:06 np0005554845 nova_compute[187128]: 2025-12-11 06:25:06.333 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:07 np0005554845 nova_compute[187128]: 2025-12-11 06:25:07.358 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:08 np0005554845 podman[226979]: 2025-12-11 06:25:08.144898304 +0000 UTC m=+0.070968801 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:25:11 np0005554845 nova_compute[187128]: 2025-12-11 06:25:11.336 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:12 np0005554845 nova_compute[187128]: 2025-12-11 06:25:12.360 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:14 np0005554845 podman[227004]: 2025-12-11 06:25:14.133942349 +0000 UTC m=+0.062359900 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=edpm, 
org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute)
Dec 11 01:25:16 np0005554845 podman[227024]: 2025-12-11 06:25:16.130591002 +0000 UTC m=+0.064027785 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:25:16 np0005554845 podman[227025]: 2025-12-11 06:25:16.196528117 +0000 UTC m=+0.115416099 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:25:16 np0005554845 nova_compute[187128]: 2025-12-11 06:25:16.338 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:17 np0005554845 nova_compute[187128]: 2025-12-11 06:25:17.362 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:20 np0005554845 podman[227065]: 2025-12-11 06:25:20.128194813 +0000 UTC m=+0.057460198 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd)
Dec 11 01:25:21 np0005554845 nova_compute[187128]: 2025-12-11 06:25:21.341 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:22 np0005554845 nova_compute[187128]: 2025-12-11 06:25:22.364 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:24 np0005554845 podman[227085]: 2025-12-11 06:25:24.11449546 +0000 UTC m=+0.051175240 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:25:24 np0005554845 podman[227086]: 2025-12-11 06:25:24.133234444 +0000 UTC m=+0.065218967 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, name=ubi9-minimal, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-type=git, container_name=openstack_network_exporter, config_id=edpm, io.openshift.expose-services=, release=1755695350, architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 01:25:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:25:26.237 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:25:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:25:26.238 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:25:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:25:26.238 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:25:26 np0005554845 nova_compute[187128]: 2025-12-11 06:25:26.343 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:27 np0005554845 nova_compute[187128]: 2025-12-11 06:25:27.367 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.101 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:25:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:25:31 np0005554845 nova_compute[187128]: 2025-12-11 06:25:31.361 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:32 np0005554845 nova_compute[187128]: 2025-12-11 06:25:32.369 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:34 np0005554845 ovn_controller[95428]: 2025-12-11T06:25:34Z|00336|memory_trim|INFO|Detected inactivity (last active 30011 ms ago): trimming memory
Dec 11 01:25:36 np0005554845 nova_compute[187128]: 2025-12-11 06:25:36.363 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:36 np0005554845 nova_compute[187128]: 2025-12-11 06:25:36.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:25:37 np0005554845 nova_compute[187128]: 2025-12-11 06:25:37.372 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:39 np0005554845 podman[227127]: 2025-12-11 06:25:39.1451245 +0000 UTC m=+0.070890629 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:25:40 np0005554845 nova_compute[187128]: 2025-12-11 06:25:40.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:25:40 np0005554845 nova_compute[187128]: 2025-12-11 06:25:40.722 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:25:40 np0005554845 nova_compute[187128]: 2025-12-11 06:25:40.722 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:25:40 np0005554845 nova_compute[187128]: 2025-12-11 06:25:40.723 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:25:40 np0005554845 nova_compute[187128]: 2025-12-11 06:25:40.723 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:25:40 np0005554845 nova_compute[187128]: 2025-12-11 06:25:40.870 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:25:40 np0005554845 nova_compute[187128]: 2025-12-11 06:25:40.872 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5749MB free_disk=73.28400039672852GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:25:40 np0005554845 nova_compute[187128]: 2025-12-11 06:25:40.872 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:25:40 np0005554845 nova_compute[187128]: 2025-12-11 06:25:40.872 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.067 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.067 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.086 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing inventories for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.105 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating ProviderTree inventory for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.105 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.122 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing aggregate associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.141 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing trait associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, traits: COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.161 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.174 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.176 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.176 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.304s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:25:41 np0005554845 nova_compute[187128]: 2025-12-11 06:25:41.365 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:42 np0005554845 nova_compute[187128]: 2025-12-11 06:25:42.374 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:43 np0005554845 nova_compute[187128]: 2025-12-11 06:25:43.174 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:25:43 np0005554845 nova_compute[187128]: 2025-12-11 06:25:43.175 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:25:43 np0005554845 nova_compute[187128]: 2025-12-11 06:25:43.175 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:25:43 np0005554845 nova_compute[187128]: 2025-12-11 06:25:43.175 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:25:43 np0005554845 nova_compute[187128]: 2025-12-11 06:25:43.188 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:25:43 np0005554845 nova_compute[187128]: 2025-12-11 06:25:43.189 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:25:43 np0005554845 nova_compute[187128]: 2025-12-11 06:25:43.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:25:43 np0005554845 nova_compute[187128]: 2025-12-11 06:25:43.693 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:25:44 np0005554845 nova_compute[187128]: 2025-12-11 06:25:44.693 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:25:45 np0005554845 podman[227152]: 2025-12-11 06:25:45.139359783 +0000 UTC m=+0.074613839 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:25:45 np0005554845 nova_compute[187128]: 2025-12-11 06:25:45.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:25:45 np0005554845 nova_compute[187128]: 2025-12-11 06:25:45.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:25:46 np0005554845 nova_compute[187128]: 2025-12-11 06:25:46.366 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:47 np0005554845 podman[227172]: 2025-12-11 06:25:47.140332162 +0000 UTC m=+0.063307734 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS)
Dec 11 01:25:47 np0005554845 podman[227173]: 2025-12-11 06:25:47.152304365 +0000 UTC m=+0.084767793 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, managed_by=edpm_ansible, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 01:25:47 np0005554845 nova_compute[187128]: 2025-12-11 06:25:47.377 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:51 np0005554845 podman[227217]: 2025-12-11 06:25:51.126664381 +0000 UTC m=+0.060843649 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:25:51 np0005554845 nova_compute[187128]: 2025-12-11 06:25:51.369 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:52 np0005554845 nova_compute[187128]: 2025-12-11 06:25:52.378 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:55 np0005554845 podman[227239]: 2025-12-11 06:25:55.135229707 +0000 UTC m=+0.056940183 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:25:55 np0005554845 podman[227240]: 2025-12-11 06:25:55.165111262 +0000 UTC m=+0.076868480 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., version=9.6, build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter, name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, distribution-scope=public, architecture=x86_64, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7)
Dec 11 01:25:56 np0005554845 nova_compute[187128]: 2025-12-11 06:25:56.369 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:25:57 np0005554845 nova_compute[187128]: 2025-12-11 06:25:57.380 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:00 np0005554845 systemd-logind[789]: New session 28 of user zuul.
Dec 11 01:26:00 np0005554845 systemd[1]: Started Session 28 of User zuul.
Dec 11 01:26:01 np0005554845 nova_compute[187128]: 2025-12-11 06:26:01.370 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:02 np0005554845 nova_compute[187128]: 2025-12-11 06:26:02.383 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:06 np0005554845 nova_compute[187128]: 2025-12-11 06:26:06.373 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:07 np0005554845 nova_compute[187128]: 2025-12-11 06:26:07.384 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:08 np0005554845 ovs-vsctl[227503]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 11 01:26:08 np0005554845 systemd[1]: proc-sys-fs-binfmt_misc.automount: Got automount request for /proc/sys/fs/binfmt_misc, triggered by 227316 (sos)
Dec 11 01:26:08 np0005554845 systemd[1]: Mounting Arbitrary Executable File Formats File System...
Dec 11 01:26:08 np0005554845 systemd[1]: Mounted Arbitrary Executable File Formats File System.
Dec 11 01:26:09 np0005554845 virtqemud[186638]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 11 01:26:09 np0005554845 virtqemud[186638]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 11 01:26:09 np0005554845 virtqemud[186638]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 11 01:26:09 np0005554845 podman[227705]: 2025-12-11 06:26:09.524512795 +0000 UTC m=+0.061609380 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:26:11 np0005554845 nova_compute[187128]: 2025-12-11 06:26:11.374 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:12 np0005554845 systemd[1]: Starting Hostname Service...
Dec 11 01:26:12 np0005554845 nova_compute[187128]: 2025-12-11 06:26:12.386 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:12 np0005554845 systemd[1]: Started Hostname Service.
Dec 11 01:26:16 np0005554845 podman[228430]: 2025-12-11 06:26:16.191343402 +0000 UTC m=+0.107126735 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:26:16 np0005554845 nova_compute[187128]: 2025-12-11 06:26:16.376 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:17 np0005554845 nova_compute[187128]: 2025-12-11 06:26:17.389 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:18 np0005554845 podman[228726]: 2025-12-11 06:26:18.178230132 +0000 UTC m=+0.094663329 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 01:26:18 np0005554845 podman[228733]: 2025-12-11 06:26:18.21753785 +0000 UTC m=+0.132369314 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, container_name=ovn_controller, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 01:26:19 np0005554845 ovs-appctl[229300]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 11 01:26:19 np0005554845 ovs-appctl[229312]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 11 01:26:19 np0005554845 ovs-appctl[229326]: ovs|00001|daemon_unix|WARN|/var/run/openvswitch/ovs-monitor-ipsec.pid: open: No such file or directory
Dec 11 01:26:21 np0005554845 nova_compute[187128]: 2025-12-11 06:26:21.377 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:22 np0005554845 podman[230216]: 2025-12-11 06:26:22.145555548 +0000 UTC m=+0.070406486 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=multipathd, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:26:22 np0005554845 nova_compute[187128]: 2025-12-11 06:26:22.391 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:25 np0005554845 podman[230437]: 2025-12-11 06:26:25.837966125 +0000 UTC m=+0.057502359 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:26:25 np0005554845 podman[230439]: 2025-12-11 06:26:25.844250224 +0000 UTC m=+0.072079221 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vendor=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, config_id=edpm, container_name=openstack_network_exporter, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, maintainer=Red Hat, Inc., name=ubi9-minimal, version=9.6, vcs-type=git, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, managed_by=edpm_ansible, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, distribution-scope=public, architecture=x86_64, url=https://catalog.redhat.com/en/search?searchType=containers)
Dec 11 01:26:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:26:26.239 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:26:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:26:26.239 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:26:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:26:26.240 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:26:26 np0005554845 nova_compute[187128]: 2025-12-11 06:26:26.377 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:26 np0005554845 virtqemud[186638]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 11 01:26:27 np0005554845 nova_compute[187128]: 2025-12-11 06:26:27.393 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:28 np0005554845 systemd[1]: Starting Time & Date Service...
Dec 11 01:26:28 np0005554845 systemd[1]: Started Time & Date Service.
Dec 11 01:26:31 np0005554845 nova_compute[187128]: 2025-12-11 06:26:31.380 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:32 np0005554845 nova_compute[187128]: 2025-12-11 06:26:32.396 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:36 np0005554845 nova_compute[187128]: 2025-12-11 06:26:36.379 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:36 np0005554845 nova_compute[187128]: 2025-12-11 06:26:36.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:26:37 np0005554845 nova_compute[187128]: 2025-12-11 06:26:37.398 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:40 np0005554845 podman[230913]: 2025-12-11 06:26:40.201224941 +0000 UTC m=+0.116276560 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:26:41 np0005554845 nova_compute[187128]: 2025-12-11 06:26:41.383 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.401 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.686 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.723 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.723 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.723 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.724 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.878 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.880 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5461MB free_disk=72.86772918701172GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.880 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.880 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.965 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.965 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:26:42 np0005554845 nova_compute[187128]: 2025-12-11 06:26:42.983 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:26:43 np0005554845 nova_compute[187128]: 2025-12-11 06:26:43.002 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:26:43 np0005554845 nova_compute[187128]: 2025-12-11 06:26:43.025 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:26:43 np0005554845 nova_compute[187128]: 2025-12-11 06:26:43.025 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.145s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:26:45 np0005554845 nova_compute[187128]: 2025-12-11 06:26:45.027 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:26:45 np0005554845 nova_compute[187128]: 2025-12-11 06:26:45.028 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:26:45 np0005554845 nova_compute[187128]: 2025-12-11 06:26:45.028 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:26:45 np0005554845 nova_compute[187128]: 2025-12-11 06:26:45.049 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:26:45 np0005554845 nova_compute[187128]: 2025-12-11 06:26:45.050 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:26:45 np0005554845 nova_compute[187128]: 2025-12-11 06:26:45.050 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:26:45 np0005554845 nova_compute[187128]: 2025-12-11 06:26:45.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:26:45 np0005554845 nova_compute[187128]: 2025-12-11 06:26:45.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:26:46 np0005554845 nova_compute[187128]: 2025-12-11 06:26:46.384 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:47 np0005554845 podman[230937]: 2025-12-11 06:26:47.154131059 +0000 UTC m=+0.076491770 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, 
org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm)
Dec 11 01:26:47 np0005554845 nova_compute[187128]: 2025-12-11 06:26:47.404 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:47 np0005554845 nova_compute[187128]: 2025-12-11 06:26:47.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:26:47 np0005554845 nova_compute[187128]: 2025-12-11 06:26:47.721 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:26:47 np0005554845 nova_compute[187128]: 2025-12-11 06:26:47.721 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:26:48 np0005554845 podman[230958]: 2025-12-11 06:26:48.80112586 +0000 UTC m=+0.063136200 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, io.buildah.version=1.41.3)
Dec 11 01:26:48 np0005554845 podman[230959]: 2025-12-11 06:26:48.82304162 +0000 UTC m=+0.089238783 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, maintainer=OpenStack Kubernetes Operator team, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller)
Dec 11 01:26:49 np0005554845 systemd[1]: session-28.scope: Deactivated successfully.
Dec 11 01:26:49 np0005554845 systemd[1]: session-28.scope: Consumed 1min 18.098s CPU time, 517.8M memory peak, read 103.0M from disk, written 45.9M to disk.
Dec 11 01:26:49 np0005554845 systemd-logind[789]: Session 28 logged out. Waiting for processes to exit.
Dec 11 01:26:49 np0005554845 systemd-logind[789]: Removed session 28.
Dec 11 01:26:49 np0005554845 systemd-logind[789]: New session 29 of user zuul.
Dec 11 01:26:49 np0005554845 systemd[1]: Started Session 29 of User zuul.
Dec 11 01:26:50 np0005554845 systemd[1]: session-29.scope: Deactivated successfully.
Dec 11 01:26:50 np0005554845 systemd-logind[789]: Session 29 logged out. Waiting for processes to exit.
Dec 11 01:26:50 np0005554845 systemd-logind[789]: Removed session 29.
Dec 11 01:26:50 np0005554845 systemd-logind[789]: New session 30 of user zuul.
Dec 11 01:26:50 np0005554845 systemd[1]: Started Session 30 of User zuul.
Dec 11 01:26:50 np0005554845 systemd[1]: session-30.scope: Deactivated successfully.
Dec 11 01:26:50 np0005554845 systemd-logind[789]: Session 30 logged out. Waiting for processes to exit.
Dec 11 01:26:50 np0005554845 systemd-logind[789]: Removed session 30.
Dec 11 01:26:51 np0005554845 nova_compute[187128]: 2025-12-11 06:26:51.386 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:52 np0005554845 nova_compute[187128]: 2025-12-11 06:26:52.415 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:53 np0005554845 podman[231061]: 2025-12-11 06:26:53.136720159 +0000 UTC m=+0.069436881 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, config_id=multipathd, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd)
Dec 11 01:26:56 np0005554845 podman[231082]: 2025-12-11 06:26:56.146020008 +0000 UTC m=+0.077415875 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:26:56 np0005554845 podman[231083]: 2025-12-11 06:26:56.193229919 +0000 UTC m=+0.110694931 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, architecture=x86_64, maintainer=Red Hat, Inc., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, container_name=openstack_network_exporter, vendor=Red Hat, Inc., config_id=edpm, io.buildah.version=1.33.7, url=https://catalog.redhat.com/en/search?searchType=containers, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, io.openshift.tags=minimal rhel9, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, distribution-scope=public, managed_by=edpm_ansible, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-type=git, build-date=2025-08-20T13:12:41, version=9.6)
Dec 11 01:26:56 np0005554845 nova_compute[187128]: 2025-12-11 06:26:56.387 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:57 np0005554845 nova_compute[187128]: 2025-12-11 06:26:57.417 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:26:58 np0005554845 systemd[1]: systemd-timedated.service: Deactivated successfully.
Dec 11 01:26:58 np0005554845 systemd[1]: systemd-hostnamed.service: Deactivated successfully.
Dec 11 01:27:02 np0005554845 nova_compute[187128]: 2025-12-11 06:27:02.396 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:02 np0005554845 nova_compute[187128]: 2025-12-11 06:27:02.419 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:07 np0005554845 nova_compute[187128]: 2025-12-11 06:27:07.398 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:07 np0005554845 nova_compute[187128]: 2025-12-11 06:27:07.421 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:11 np0005554845 podman[231134]: 2025-12-11 06:27:11.164949874 +0000 UTC m=+0.081040902 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:27:12 np0005554845 nova_compute[187128]: 2025-12-11 06:27:12.399 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:12 np0005554845 nova_compute[187128]: 2025-12-11 06:27:12.421 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:17 np0005554845 nova_compute[187128]: 2025-12-11 06:27:17.404 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:17 np0005554845 nova_compute[187128]: 2025-12-11 06:27:17.424 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:18 np0005554845 podman[231158]: 2025-12-11 06:27:18.15833285 +0000 UTC m=+0.079093679 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, org.label-schema.build-date=20251202, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.license=GPLv2, 
org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:27:19 np0005554845 podman[231178]: 2025-12-11 06:27:19.17106545 +0000 UTC m=+0.097685141 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, 
org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true)
Dec 11 01:27:19 np0005554845 podman[231179]: 2025-12-11 06:27:19.255323748 +0000 UTC m=+0.174727155 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=ovn_controller, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.schema-version=1.0)
Dec 11 01:27:22 np0005554845 nova_compute[187128]: 2025-12-11 06:27:22.406 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:22 np0005554845 nova_compute[187128]: 2025-12-11 06:27:22.426 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:24 np0005554845 podman[231224]: 2025-12-11 06:27:24.124808128 +0000 UTC m=+0.060271514 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, managed_by=edpm_ansible, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:27:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:27:26.242 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:27:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:27:26.242 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:27:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:27:26.243 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:27:27 np0005554845 podman[231244]: 2025-12-11 06:27:27.125189737 +0000 UTC m=+0.055500665 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:27:27 np0005554845 podman[231245]: 2025-12-11 06:27:27.149171703 +0000 UTC m=+0.076266745 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, managed_by=edpm_ansible, url=https://catalog.redhat.com/en/search?searchType=containers, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., release=1755695350, io.openshift.tags=minimal rhel9, vcs-type=git, io.buildah.version=1.33.7, distribution-scope=public, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, vendor=Red Hat, Inc.)
Dec 11 01:27:27 np0005554845 nova_compute[187128]: 2025-12-11 06:27:27.407 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:27 np0005554845 nova_compute[187128]: 2025-12-11 06:27:27.427 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:27:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:27:32 np0005554845 nova_compute[187128]: 2025-12-11 06:27:32.409 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:32 np0005554845 nova_compute[187128]: 2025-12-11 06:27:32.429 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:37 np0005554845 nova_compute[187128]: 2025-12-11 06:27:37.411 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:37 np0005554845 nova_compute[187128]: 2025-12-11 06:27:37.431 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:37 np0005554845 nova_compute[187128]: 2025-12-11 06:27:37.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:27:42 np0005554845 podman[231290]: 2025-12-11 06:27:42.126240013 +0000 UTC m=+0.055391873 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:27:42 np0005554845 nova_compute[187128]: 2025-12-11 06:27:42.414 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:42 np0005554845 nova_compute[187128]: 2025-12-11 06:27:42.433 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:43 np0005554845 nova_compute[187128]: 2025-12-11 06:27:43.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:27:43 np0005554845 nova_compute[187128]: 2025-12-11 06:27:43.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:27:43 np0005554845 nova_compute[187128]: 2025-12-11 06:27:43.767 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:27:43 np0005554845 nova_compute[187128]: 2025-12-11 06:27:43.768 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:27:43 np0005554845 nova_compute[187128]: 2025-12-11 06:27:43.769 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:27:43 np0005554845 nova_compute[187128]: 2025-12-11 06:27:43.769 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:27:43 np0005554845 nova_compute[187128]: 2025-12-11 06:27:43.920 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:27:43 np0005554845 nova_compute[187128]: 2025-12-11 06:27:43.921 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5654MB free_disk=73.28378295898438GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:27:43 np0005554845 nova_compute[187128]: 2025-12-11 06:27:43.921 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:27:43 np0005554845 nova_compute[187128]: 2025-12-11 06:27:43.921 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:27:44 np0005554845 nova_compute[187128]: 2025-12-11 06:27:44.077 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:27:44 np0005554845 nova_compute[187128]: 2025-12-11 06:27:44.078 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:27:44 np0005554845 nova_compute[187128]: 2025-12-11 06:27:44.106 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:27:44 np0005554845 nova_compute[187128]: 2025-12-11 06:27:44.149 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:27:44 np0005554845 nova_compute[187128]: 2025-12-11 06:27:44.188 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:27:44 np0005554845 nova_compute[187128]: 2025-12-11 06:27:44.189 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.267s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:27:46 np0005554845 nova_compute[187128]: 2025-12-11 06:27:46.190 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:27:46 np0005554845 nova_compute[187128]: 2025-12-11 06:27:46.191 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:27:46 np0005554845 nova_compute[187128]: 2025-12-11 06:27:46.191 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:27:46 np0005554845 nova_compute[187128]: 2025-12-11 06:27:46.216 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:27:46 np0005554845 nova_compute[187128]: 2025-12-11 06:27:46.217 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:27:46 np0005554845 nova_compute[187128]: 2025-12-11 06:27:46.217 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:27:46 np0005554845 nova_compute[187128]: 2025-12-11 06:27:46.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:27:46 np0005554845 nova_compute[187128]: 2025-12-11 06:27:46.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:27:47 np0005554845 nova_compute[187128]: 2025-12-11 06:27:47.417 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:47 np0005554845 nova_compute[187128]: 2025-12-11 06:27:47.435 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:49 np0005554845 podman[231314]: 2025-12-11 06:27:49.114176802 +0000 UTC m=+0.053394158 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:27:49 np0005554845 nova_compute[187128]: 2025-12-11 06:27:49.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:27:49 np0005554845 nova_compute[187128]: 2025-12-11 06:27:49.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:27:50 np0005554845 podman[231335]: 2025-12-11 06:27:50.1201686 +0000 UTC m=+0.052512075 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, 
org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_metadata_agent)
Dec 11 01:27:50 np0005554845 podman[231336]: 2025-12-11 06:27:50.172507259 +0000 UTC m=+0.103216719 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, io.buildah.version=1.41.3, config_id=ovn_controller, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 01:27:52 np0005554845 nova_compute[187128]: 2025-12-11 06:27:52.419 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:52 np0005554845 nova_compute[187128]: 2025-12-11 06:27:52.436 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:55 np0005554845 podman[231380]: 2025-12-11 06:27:55.131228129 +0000 UTC m=+0.068430134 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251202)
Dec 11 01:27:57 np0005554845 nova_compute[187128]: 2025-12-11 06:27:57.421 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:57 np0005554845 nova_compute[187128]: 2025-12-11 06:27:57.438 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:27:58 np0005554845 podman[231402]: 2025-12-11 06:27:58.129126101 +0000 UTC m=+0.057050347 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.expose-services=, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, release=1755695350, managed_by=edpm_ansible, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., container_name=openstack_network_exporter, version=9.6, config_id=edpm, architecture=x86_64, build-date=2025-08-20T13:12:41, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, name=ubi9-minimal)
Dec 11 01:27:58 np0005554845 podman[231401]: 2025-12-11 06:27:58.141526655 +0000 UTC m=+0.074508256 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:28:02 np0005554845 nova_compute[187128]: 2025-12-11 06:28:02.439 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:02 np0005554845 nova_compute[187128]: 2025-12-11 06:28:02.890 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:07 np0005554845 nova_compute[187128]: 2025-12-11 06:28:07.440 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:07 np0005554845 nova_compute[187128]: 2025-12-11 06:28:07.892 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:12 np0005554845 nova_compute[187128]: 2025-12-11 06:28:12.444 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:12 np0005554845 nova_compute[187128]: 2025-12-11 06:28:12.894 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:13 np0005554845 podman[231445]: 2025-12-11 06:28:13.112173161 +0000 UTC m=+0.049747799 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:28:17 np0005554845 nova_compute[187128]: 2025-12-11 06:28:17.445 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:17 np0005554845 nova_compute[187128]: 2025-12-11 06:28:17.896 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:20 np0005554845 podman[231470]: 2025-12-11 06:28:20.1227029 +0000 UTC m=+0.057303624 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', 
'/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']})
Dec 11 01:28:20 np0005554845 podman[231490]: 2025-12-11 06:28:20.233213274 +0000 UTC m=+0.060796867 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 11 01:28:20 np0005554845 podman[231510]: 2025-12-11 06:28:20.341012776 +0000 UTC m=+0.081908215 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, container_name=ovn_controller, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 01:28:22 np0005554845 nova_compute[187128]: 2025-12-11 06:28:22.449 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:22 np0005554845 nova_compute[187128]: 2025-12-11 06:28:22.898 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:26 np0005554845 podman[231536]: 2025-12-11 06:28:26.123475039 +0000 UTC m=+0.059548814 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, tcib_managed=true)
Dec 11 01:28:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:28:26.244 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:28:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:28:26.245 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:28:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:28:26.245 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:28:27 np0005554845 nova_compute[187128]: 2025-12-11 06:28:27.451 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:27 np0005554845 nova_compute[187128]: 2025-12-11 06:28:27.898 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:29 np0005554845 podman[231557]: 2025-12-11 06:28:29.129641464 +0000 UTC m=+0.062432811 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:28:29 np0005554845 podman[231558]: 2025-12-11 06:28:29.15065585 +0000 UTC m=+0.077954520 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.tags=minimal rhel9, build-date=2025-08-20T13:12:41, distribution-scope=public, architecture=x86_64, vendor=Red Hat, Inc., io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, container_name=openstack_network_exporter, io.buildah.version=1.33.7, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. 
This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, url=https://catalog.redhat.com/en/search?searchType=containers, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, managed_by=edpm_ansible, vcs-type=git, version=9.6)
Dec 11 01:28:32 np0005554845 nova_compute[187128]: 2025-12-11 06:28:32.453 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:32 np0005554845 nova_compute[187128]: 2025-12-11 06:28:32.901 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:37 np0005554845 nova_compute[187128]: 2025-12-11 06:28:37.456 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:37 np0005554845 nova_compute[187128]: 2025-12-11 06:28:37.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:37 np0005554845 nova_compute[187128]: 2025-12-11 06:28:37.902 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:42 np0005554845 nova_compute[187128]: 2025-12-11 06:28:42.460 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:42 np0005554845 nova_compute[187128]: 2025-12-11 06:28:42.904 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.712 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.713 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.713 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.713 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.864 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.865 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5662MB free_disk=73.28379821777344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.865 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.865 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.938 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:28:43 np0005554845 nova_compute[187128]: 2025-12-11 06:28:43.939 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:28:44 np0005554845 nova_compute[187128]: 2025-12-11 06:28:44.094 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:28:44 np0005554845 nova_compute[187128]: 2025-12-11 06:28:44.108 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:28:44 np0005554845 nova_compute[187128]: 2025-12-11 06:28:44.109 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:28:44 np0005554845 nova_compute[187128]: 2025-12-11 06:28:44.109 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.244s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:28:44 np0005554845 podman[231602]: 2025-12-11 06:28:44.118212593 +0000 UTC m=+0.052412511 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:28:46 np0005554845 nova_compute[187128]: 2025-12-11 06:28:46.111 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:46 np0005554845 nova_compute[187128]: 2025-12-11 06:28:46.111 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:28:46 np0005554845 nova_compute[187128]: 2025-12-11 06:28:46.111 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:28:46 np0005554845 nova_compute[187128]: 2025-12-11 06:28:46.128 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:28:46 np0005554845 nova_compute[187128]: 2025-12-11 06:28:46.128 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:46 np0005554845 nova_compute[187128]: 2025-12-11 06:28:46.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:46 np0005554845 nova_compute[187128]: 2025-12-11 06:28:46.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:47 np0005554845 nova_compute[187128]: 2025-12-11 06:28:47.463 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:47 np0005554845 nova_compute[187128]: 2025-12-11 06:28:47.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:47 np0005554845 nova_compute[187128]: 2025-12-11 06:28:47.905 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:51 np0005554845 podman[231625]: 2025-12-11 06:28:51.122429062 +0000 UTC m=+0.057532769 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.license=GPLv2, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:28:51 np0005554845 podman[231627]: 2025-12-11 06:28:51.130398121 +0000 UTC m=+0.055363501 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, 
tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.license=GPLv2)
Dec 11 01:28:51 np0005554845 podman[231626]: 2025-12-11 06:28:51.159931436 +0000 UTC m=+0.090636927 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 01:28:51 np0005554845 nova_compute[187128]: 2025-12-11 06:28:51.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:51 np0005554845 nova_compute[187128]: 2025-12-11 06:28:51.703 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:51 np0005554845 nova_compute[187128]: 2025-12-11 06:28:51.703 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:28:52 np0005554845 nova_compute[187128]: 2025-12-11 06:28:52.466 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:52 np0005554845 nova_compute[187128]: 2025-12-11 06:28:52.908 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:56 np0005554845 nova_compute[187128]: 2025-12-11 06:28:56.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:56 np0005554845 nova_compute[187128]: 2025-12-11 06:28:56.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec 11 01:28:56 np0005554845 nova_compute[187128]: 2025-12-11 06:28:56.708 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec 11 01:28:57 np0005554845 podman[231686]: 2025-12-11 06:28:57.12620533 +0000 UTC m=+0.060411615 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:28:57 np0005554845 nova_compute[187128]: 2025-12-11 06:28:57.469 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:57 np0005554845 nova_compute[187128]: 2025-12-11 06:28:57.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:28:57 np0005554845 nova_compute[187128]: 2025-12-11 06:28:57.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec 11 01:28:57 np0005554845 nova_compute[187128]: 2025-12-11 06:28:57.911 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:28:58 np0005554845 nova_compute[187128]: 2025-12-11 06:28:58.733 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_running_deleted_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:00 np0005554845 podman[231706]: 2025-12-11 06:29:00.12235883 +0000 UTC m=+0.054002496 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:29:00 np0005554845 podman[231707]: 2025-12-11 06:29:00.139296704 +0000 UTC m=+0.064187293 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, architecture=x86_64, distribution-scope=public, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vendor=Red Hat, Inc., config_id=edpm, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, release=1755695350, build-date=2025-08-20T13:12:41, io.openshift.expose-services=, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., url=https://catalog.redhat.com/en/search?searchType=containers, io.buildah.version=1.33.7, name=ubi9-minimal, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, container_name=openstack_network_exporter)
Dec 11 01:29:02 np0005554845 nova_compute[187128]: 2025-12-11 06:29:02.473 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:02 np0005554845 nova_compute[187128]: 2025-12-11 06:29:02.912 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:06 np0005554845 nova_compute[187128]: 2025-12-11 06:29:06.693 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:07 np0005554845 nova_compute[187128]: 2025-12-11 06:29:07.476 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:07 np0005554845 nova_compute[187128]: 2025-12-11 06:29:07.914 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:12 np0005554845 nova_compute[187128]: 2025-12-11 06:29:12.479 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:12 np0005554845 nova_compute[187128]: 2025-12-11 06:29:12.916 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:15 np0005554845 podman[231752]: 2025-12-11 06:29:15.11485603 +0000 UTC m=+0.047664471 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:29:17 np0005554845 nova_compute[187128]: 2025-12-11 06:29:17.481 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:17 np0005554845 nova_compute[187128]: 2025-12-11 06:29:17.918 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:22 np0005554845 podman[231777]: 2025-12-11 06:29:22.139958614 +0000 UTC m=+0.061161275 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 11 01:29:22 np0005554845 podman[231779]: 2025-12-11 06:29:22.156407614 +0000 UTC m=+0.069248716 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_id=edpm, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:29:22 np0005554845 podman[231778]: 2025-12-11 06:29:22.169891098 +0000 UTC m=+0.090866943 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:29:22 np0005554845 nova_compute[187128]: 2025-12-11 06:29:22.483 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:22 np0005554845 nova_compute[187128]: 2025-12-11 06:29:22.953 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:29:26.245 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:29:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:29:26.247 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:29:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:29:26.247 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:29:27 np0005554845 nova_compute[187128]: 2025-12-11 06:29:27.485 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:27 np0005554845 nova_compute[187128]: 2025-12-11 06:29:27.954 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:28 np0005554845 podman[231839]: 2025-12-11 06:29:28.125171044 +0000 UTC m=+0.056749688 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2)
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.102 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:29:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:29:31 np0005554845 podman[231859]: 2025-12-11 06:29:31.111443214 +0000 UTC m=+0.043279185 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:29:31 np0005554845 podman[231860]: 2025-12-11 06:29:31.126500149 +0000 UTC m=+0.053993726 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container, architecture=x86_64, distribution-scope=public, name=ubi9-minimal, release=1755695350, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.buildah.version=1.33.7, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., managed_by=edpm_ansible, vendor=Red Hat, Inc., maintainer=Red Hat, Inc., build-date=2025-08-20T13:12:41, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal)
Dec 11 01:29:32 np0005554845 nova_compute[187128]: 2025-12-11 06:29:32.487 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:32 np0005554845 nova_compute[187128]: 2025-12-11 06:29:32.956 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:37 np0005554845 nova_compute[187128]: 2025-12-11 06:29:37.490 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:37 np0005554845 nova_compute[187128]: 2025-12-11 06:29:37.956 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:38 np0005554845 nova_compute[187128]: 2025-12-11 06:29:38.706 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:42 np0005554845 nova_compute[187128]: 2025-12-11 06:29:42.493 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:42 np0005554845 nova_compute[187128]: 2025-12-11 06:29:42.959 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:43 np0005554845 nova_compute[187128]: 2025-12-11 06:29:43.688 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_power_states run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:44 np0005554845 nova_compute[187128]: 2025-12-11 06:29:44.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:44 np0005554845 nova_compute[187128]: 2025-12-11 06:29:44.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:44 np0005554845 nova_compute[187128]: 2025-12-11 06:29:44.753 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:29:44 np0005554845 nova_compute[187128]: 2025-12-11 06:29:44.754 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:29:44 np0005554845 nova_compute[187128]: 2025-12-11 06:29:44.755 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:29:44 np0005554845 nova_compute[187128]: 2025-12-11 06:29:44.755 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:29:44 np0005554845 nova_compute[187128]: 2025-12-11 06:29:44.959 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:29:44 np0005554845 nova_compute[187128]: 2025-12-11 06:29:44.960 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5676MB free_disk=73.28379821777344GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:29:44 np0005554845 nova_compute[187128]: 2025-12-11 06:29:44.961 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:29:44 np0005554845 nova_compute[187128]: 2025-12-11 06:29:44.961 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:29:45 np0005554845 nova_compute[187128]: 2025-12-11 06:29:45.041 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:29:45 np0005554845 nova_compute[187128]: 2025-12-11 06:29:45.041 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:29:45 np0005554845 nova_compute[187128]: 2025-12-11 06:29:45.127 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:29:45 np0005554845 nova_compute[187128]: 2025-12-11 06:29:45.149 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:29:45 np0005554845 nova_compute[187128]: 2025-12-11 06:29:45.151 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:29:45 np0005554845 nova_compute[187128]: 2025-12-11 06:29:45.151 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.191s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:29:46 np0005554845 podman[231901]: 2025-12-11 06:29:46.150395192 +0000 UTC m=+0.070065148 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:29:46 np0005554845 nova_compute[187128]: 2025-12-11 06:29:46.150 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:46 np0005554845 nova_compute[187128]: 2025-12-11 06:29:46.150 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:29:46 np0005554845 nova_compute[187128]: 2025-12-11 06:29:46.150 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:29:46 np0005554845 nova_compute[187128]: 2025-12-11 06:29:46.173 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:29:47 np0005554845 nova_compute[187128]: 2025-12-11 06:29:47.496 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:47 np0005554845 nova_compute[187128]: 2025-12-11 06:29:47.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:47 np0005554845 nova_compute[187128]: 2025-12-11 06:29:47.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:47 np0005554845 nova_compute[187128]: 2025-12-11 06:29:47.994 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:48 np0005554845 nova_compute[187128]: 2025-12-11 06:29:48.693 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:49 np0005554845 nova_compute[187128]: 2025-12-11 06:29:49.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:52 np0005554845 nova_compute[187128]: 2025-12-11 06:29:52.498 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:52 np0005554845 podman[231923]: 2025-12-11 06:29:52.598510648 +0000 UTC m=+0.071982268 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 11 01:29:52 np0005554845 podman[231925]: 2025-12-11 06:29:52.620166467 +0000 UTC m=+0.084143207 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, 
managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:29:52 np0005554845 podman[231924]: 2025-12-11 06:29:52.689472863 +0000 UTC m=+0.149171531 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_id=ovn_controller)
Dec 11 01:29:52 np0005554845 nova_compute[187128]: 2025-12-11 06:29:52.993 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:53 np0005554845 nova_compute[187128]: 2025-12-11 06:29:53.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:29:53 np0005554845 nova_compute[187128]: 2025-12-11 06:29:53.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:29:57 np0005554845 nova_compute[187128]: 2025-12-11 06:29:57.500 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:57 np0005554845 nova_compute[187128]: 2025-12-11 06:29:57.996 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:29:59 np0005554845 podman[231986]: 2025-12-11 06:29:59.145347604 +0000 UTC m=+0.071322261 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:30:02 np0005554845 podman[232006]: 2025-12-11 06:30:02.264722203 +0000 UTC m=+0.076047994 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:30:02 np0005554845 podman[232007]: 2025-12-11 06:30:02.297934354 +0000 UTC m=+0.111805142 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, name=ubi9-minimal, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, release=1755695350, architecture=x86_64, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-type=git, config_id=edpm, container_name=openstack_network_exporter, vendor=Red Hat, Inc., version=9.6, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.component=ubi9-minimal-container, distribution-scope=public, build-date=2025-08-20T13:12:41, io.openshift.tags=minimal rhel9)
Dec 11 01:30:02 np0005554845 nova_compute[187128]: 2025-12-11 06:30:02.503 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:02 np0005554845 nova_compute[187128]: 2025-12-11 06:30:02.997 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:07 np0005554845 nova_compute[187128]: 2025-12-11 06:30:07.505 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:07 np0005554845 nova_compute[187128]: 2025-12-11 06:30:07.999 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:12 np0005554845 nova_compute[187128]: 2025-12-11 06:30:12.507 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:13 np0005554845 nova_compute[187128]: 2025-12-11 06:30:13.000 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:17 np0005554845 podman[232051]: 2025-12-11 06:30:17.183286917 +0000 UTC m=+0.105662341 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter)
Dec 11 01:30:17 np0005554845 nova_compute[187128]: 2025-12-11 06:30:17.509 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:18 np0005554845 nova_compute[187128]: 2025-12-11 06:30:18.003 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:22 np0005554845 nova_compute[187128]: 2025-12-11 06:30:22.513 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:23 np0005554845 nova_compute[187128]: 2025-12-11 06:30:23.005 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:23 np0005554845 podman[232076]: 2025-12-11 06:30:23.139832267 +0000 UTC m=+0.068413164 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 01:30:23 np0005554845 podman[232078]: 2025-12-11 06:30:23.160914541 +0000 UTC m=+0.076676142 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, 
org.label-schema.vendor=CentOS, config_id=edpm, container_name=ceilometer_agent_compute)
Dec 11 01:30:23 np0005554845 podman[232077]: 2025-12-11 06:30:23.171717713 +0000 UTC m=+0.096179472 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, container_name=ovn_controller, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202)
Dec 11 01:30:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:30:26.246 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:30:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:30:26.247 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:30:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:30:26.247 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:30:27 np0005554845 nova_compute[187128]: 2025-12-11 06:30:27.515 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:28 np0005554845 nova_compute[187128]: 2025-12-11 06:30:28.007 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:30 np0005554845 podman[232141]: 2025-12-11 06:30:30.137356768 +0000 UTC m=+0.075013847 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.schema-version=1.0, tcib_managed=true, container_name=multipathd, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']})
Dec 11 01:30:32 np0005554845 nova_compute[187128]: 2025-12-11 06:30:32.518 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:33 np0005554845 nova_compute[187128]: 2025-12-11 06:30:33.009 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:33 np0005554845 podman[232161]: 2025-12-11 06:30:33.125373963 +0000 UTC m=+0.061475183 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter)
Dec 11 01:30:33 np0005554845 podman[232162]: 2025-12-11 06:30:33.158958823 +0000 UTC m=+0.090808602 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, container_name=openstack_network_exporter, release=1755695350, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., managed_by=edpm_ansible, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', 
'/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, name=ubi9-minimal, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, vendor=Red Hat, Inc., distribution-scope=public, architecture=x86_64, io.openshift.expose-services=, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., com.redhat.component=ubi9-minimal-container)
Dec 11 01:30:36 np0005554845 nova_compute[187128]: 2025-12-11 06:30:36.611 187132 DEBUG oslo_concurrency.processutils [None req-317c34a9-b54d-4aa9-9f1f-5efb4dad6e5e 77e5fd8a8d4645f58602da9f89feb3a3 58891547c7294a57a183f092c2e8f0a6 - - default default] Running cmd (subprocess): env LANG=C uptime execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:384#033[00m
Dec 11 01:30:36 np0005554845 nova_compute[187128]: 2025-12-11 06:30:36.653 187132 DEBUG oslo_concurrency.processutils [None req-317c34a9-b54d-4aa9-9f1f-5efb4dad6e5e 77e5fd8a8d4645f58602da9f89feb3a3 58891547c7294a57a183f092c2e8f0a6 - - default default] CMD "env LANG=C uptime" returned: 0 in 0.041s execute /usr/lib/python3.9/site-packages/oslo_concurrency/processutils.py:422#033[00m
Dec 11 01:30:37 np0005554845 nova_compute[187128]: 2025-12-11 06:30:37.520 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:38 np0005554845 nova_compute[187128]: 2025-12-11 06:30:38.046 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:39 np0005554845 nova_compute[187128]: 2025-12-11 06:30:39.697 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:30:42 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:30:42.348 104320 DEBUG ovsdbapp.backend.ovs_idl.event [-] Matched UPDATE: SbGlobalUpdateEvent(events=('update',), table='SB_Global', conditions=None, old_conditions=None), priority=20 to row=SB_Global(external_ids={}, nb_cfg=24, options={'arp_ns_explicit_output': 'true', 'mac_prefix': 'a6:cd:08', 'max_tunid': '16711680', 'northd_internal_version': '24.03.8-20.33.0-76.8', 'svc_monitor_mac': '36:c2:b2:f0:cc:9f'}, ipsec=False) old=SB_Global(nb_cfg=23) matches /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/event.py:43#033[00m
Dec 11 01:30:42 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:30:42.349 104320 DEBUG neutron.agent.ovn.metadata.agent [-] Delaying updating chassis table for 7 seconds run /usr/lib/python3.9/site-packages/neutron/agent/ovn/metadata/agent.py:274#033[00m
Dec 11 01:30:42 np0005554845 nova_compute[187128]: 2025-12-11 06:30:42.350 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:42 np0005554845 nova_compute[187128]: 2025-12-11 06:30:42.521 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:43 np0005554845 nova_compute[187128]: 2025-12-11 06:30:43.048 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:44 np0005554845 nova_compute[187128]: 2025-12-11 06:30:44.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:30:44 np0005554845 nova_compute[187128]: 2025-12-11 06:30:44.717 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:30:44 np0005554845 nova_compute[187128]: 2025-12-11 06:30:44.718 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:30:44 np0005554845 nova_compute[187128]: 2025-12-11 06:30:44.718 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:30:44 np0005554845 nova_compute[187128]: 2025-12-11 06:30:44.718 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:30:44 np0005554845 nova_compute[187128]: 2025-12-11 06:30:44.885 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:30:44 np0005554845 nova_compute[187128]: 2025-12-11 06:30:44.886 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5689MB free_disk=73.28357315063477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:30:44 np0005554845 nova_compute[187128]: 2025-12-11 06:30:44.887 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:30:44 np0005554845 nova_compute[187128]: 2025-12-11 06:30:44.887 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:30:44 np0005554845 nova_compute[187128]: 2025-12-11 06:30:44.943 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:30:44 np0005554845 nova_compute[187128]: 2025-12-11 06:30:44.943 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:30:45 np0005554845 nova_compute[187128]: 2025-12-11 06:30:45.015 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing inventories for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 01:30:45 np0005554845 nova_compute[187128]: 2025-12-11 06:30:45.034 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating ProviderTree inventory for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 01:30:45 np0005554845 nova_compute[187128]: 2025-12-11 06:30:45.035 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 01:30:45 np0005554845 nova_compute[187128]: 2025-12-11 06:30:45.050 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing aggregate associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 01:30:45 np0005554845 nova_compute[187128]: 2025-12-11 06:30:45.085 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing trait associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, traits: COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 01:30:45 np0005554845 nova_compute[187128]: 2025-12-11 06:30:45.125 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:30:45 np0005554845 nova_compute[187128]: 2025-12-11 06:30:45.153 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:30:45 np0005554845 nova_compute[187128]: 2025-12-11 06:30:45.155 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:30:45 np0005554845 nova_compute[187128]: 2025-12-11 06:30:45.156 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.269s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:30:47 np0005554845 nova_compute[187128]: 2025-12-11 06:30:47.152 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:30:47 np0005554845 nova_compute[187128]: 2025-12-11 06:30:47.525 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:47 np0005554845 nova_compute[187128]: 2025-12-11 06:30:47.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:30:47 np0005554845 nova_compute[187128]: 2025-12-11 06:30:47.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:30:47 np0005554845 nova_compute[187128]: 2025-12-11 06:30:47.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:30:47 np0005554845 nova_compute[187128]: 2025-12-11 06:30:47.709 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:30:48 np0005554845 nova_compute[187128]: 2025-12-11 06:30:48.091 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:48 np0005554845 podman[232202]: 2025-12-11 06:30:48.189666905 +0000 UTC m=+0.079913846 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:30:48 np0005554845 nova_compute[187128]: 2025-12-11 06:30:48.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:30:48 np0005554845 nova_compute[187128]: 2025-12-11 06:30:48.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:30:49 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:30:49.354 104320 DEBUG ovsdbapp.backend.ovs_idl.transaction [-] Running txn n=1 command(idx=0): DbSetCommand(_result=None, table=Chassis_Private, record=3bbd5a39-e9ff-4cd4-b463-1eb8ecef6459, col_values=(('external_ids', {'neutron:ovn-metadata-sb-cfg': '24'}),), if_exists=True) do_commit /usr/lib/python3.9/site-packages/ovsdbapp/backend/ovs_idl/transaction.py:89#033[00m
Dec 11 01:30:49 np0005554845 nova_compute[187128]: 2025-12-11 06:30:49.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:30:49 np0005554845 nova_compute[187128]: 2025-12-11 06:30:49.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:30:52 np0005554845 nova_compute[187128]: 2025-12-11 06:30:52.527 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:53 np0005554845 nova_compute[187128]: 2025-12-11 06:30:53.094 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:30:54 np0005554845 podman[232226]: 2025-12-11 06:30:54.135996237 +0000 UTC m=+0.065069957 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_metadata_agent, tcib_managed=true, container_name=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', 
'/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 01:30:54 np0005554845 podman[232228]: 2025-12-11 06:30:54.175643517 +0000 UTC m=+0.085293857 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, tcib_managed=true, container_name=ceilometer_agent_compute, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', 
'/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:30:54 np0005554845 podman[232227]: 2025-12-11 06:30:54.191288537 +0000 UTC m=+0.106099693 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 11 01:30:54 np0005554845 nova_compute[187128]: 2025-12-11 06:30:54.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:30:55 np0005554845 nova_compute[187128]: 2025-12-11 06:30:55.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210
Dec 11 01:30:55 np0005554845 nova_compute[187128]: 2025-12-11 06:30:55.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477
Dec 11 01:30:57 np0005554845 nova_compute[187128]: 2025-12-11 06:30:57.529 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:30:58 np0005554845 nova_compute[187128]: 2025-12-11 06:30:58.123 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:01 np0005554845 podman[232289]: 2025-12-11 06:31:01.14427566 +0000 UTC m=+0.075927163 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=multipathd, container_name=multipathd)
Dec 11 01:31:02 np0005554845 nova_compute[187128]: 2025-12-11 06:31:02.587 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:03 np0005554845 nova_compute[187128]: 2025-12-11 06:31:03.125 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:04 np0005554845 podman[232310]: 2025-12-11 06:31:04.160254397 +0000 UTC m=+0.082664739 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:31:04 np0005554845 podman[232311]: 2025-12-11 06:31:04.173858033 +0000 UTC m=+0.091313206 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, io.buildah.version=1.33.7, version=9.6, managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, maintainer=Red Hat, Inc., vcs-type=git, vendor=Red Hat, Inc., name=ubi9-minimal, com.redhat.component=ubi9-minimal-container, io.openshift.expose-services=, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', 
'/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm)
Dec 11 01:31:07 np0005554845 nova_compute[187128]: 2025-12-11 06:31:07.602 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:08 np0005554845 nova_compute[187128]: 2025-12-11 06:31:08.177 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:12 np0005554845 nova_compute[187128]: 2025-12-11 06:31:12.611 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:13 np0005554845 nova_compute[187128]: 2025-12-11 06:31:13.180 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:17 np0005554845 nova_compute[187128]: 2025-12-11 06:31:17.615 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:18 np0005554845 nova_compute[187128]: 2025-12-11 06:31:18.182 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:19 np0005554845 podman[232355]: 2025-12-11 06:31:19.130930146 +0000 UTC m=+0.068136488 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>)
Dec 11 01:31:22 np0005554845 nova_compute[187128]: 2025-12-11 06:31:22.617 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:23 np0005554845 nova_compute[187128]: 2025-12-11 06:31:23.184 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:25 np0005554845 podman[232379]: 2025-12-11 06:31:25.147839658 +0000 UTC m=+0.069622976 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true, io.buildah.version=1.41.3)
Dec 11 01:31:25 np0005554845 podman[232381]: 2025-12-11 06:31:25.159925865 +0000 UTC m=+0.078137750 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, 
container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:31:25 np0005554845 podman[232380]: 2025-12-11 06:31:25.195335613 +0000 UTC m=+0.110696663 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=ovn_controller, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible)
Dec 11 01:31:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:31:26.248 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404
Dec 11 01:31:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:31:26.248 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409
Dec 11 01:31:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:31:26.248 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423
Dec 11 01:31:27 np0005554845 nova_compute[187128]: 2025-12-11 06:31:27.621 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:28 np0005554845 nova_compute[187128]: 2025-12-11 06:31:28.186 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:31:30.108 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:31:32 np0005554845 podman[232440]: 2025-12-11 06:31:32.122593971 +0000 UTC m=+0.056534204 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, container_name=multipathd, org.label-schema.license=GPLv2, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, config_id=multipathd, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202)
Dec 11 01:31:32 np0005554845 nova_compute[187128]: 2025-12-11 06:31:32.624 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:31:33 np0005554845 nova_compute[187128]: 2025-12-11 06:31:33.188 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:31:35 np0005554845 podman[232460]: 2025-12-11 06:31:35.160562497 +0000 UTC m=+0.083844929 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:31:35 np0005554845 podman[232461]: 2025-12-11 06:31:35.175544309 +0000 UTC m=+0.091960421 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, version=9.6, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, maintainer=Red Hat, Inc., vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, config_id=edpm, name=ubi9-minimal, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a 
package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.buildah.version=1.33.7, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, managed_by=edpm_ansible, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=)
Dec 11 01:31:37 np0005554845 nova_compute[187128]: 2025-12-11 06:31:37.625 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:31:38 np0005554845 nova_compute[187128]: 2025-12-11 06:31:38.190 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:31:39 np0005554845 nova_compute[187128]: 2025-12-11 06:31:39.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:31:42 np0005554845 nova_compute[187128]: 2025-12-11 06:31:42.627 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:31:43 np0005554845 nova_compute[187128]: 2025-12-11 06:31:43.191 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:31:45 np0005554845 nova_compute[187128]: 2025-12-11 06:31:45.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:31:45 np0005554845 nova_compute[187128]: 2025-12-11 06:31:45.730 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:31:45 np0005554845 nova_compute[187128]: 2025-12-11 06:31:45.731 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:31:45 np0005554845 nova_compute[187128]: 2025-12-11 06:31:45.731 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:31:45 np0005554845 nova_compute[187128]: 2025-12-11 06:31:45.731 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:31:45 np0005554845 nova_compute[187128]: 2025-12-11 06:31:45.923 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:31:45 np0005554845 nova_compute[187128]: 2025-12-11 06:31:45.925 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5691MB free_disk=73.28357315063477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, 
"label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:31:45 np0005554845 nova_compute[187128]: 2025-12-11 06:31:45.926 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:31:45 np0005554845 nova_compute[187128]: 2025-12-11 06:31:45.926 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:31:46 np0005554845 nova_compute[187128]: 2025-12-11 06:31:46.042 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:31:46 np0005554845 nova_compute[187128]: 2025-12-11 06:31:46.043 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:31:46 np0005554845 nova_compute[187128]: 2025-12-11 06:31:46.068 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:31:46 np0005554845 nova_compute[187128]: 2025-12-11 06:31:46.084 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:31:46 np0005554845 nova_compute[187128]: 2025-12-11 06:31:46.085 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:31:46 np0005554845 nova_compute[187128]: 2025-12-11 06:31:46.085 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.159s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:31:47 np0005554845 nova_compute[187128]: 2025-12-11 06:31:47.629 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:31:48 np0005554845 nova_compute[187128]: 2025-12-11 06:31:48.080 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:31:48 np0005554845 nova_compute[187128]: 2025-12-11 06:31:48.244 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:31:49 np0005554845 nova_compute[187128]: 2025-12-11 06:31:49.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:31:49 np0005554845 nova_compute[187128]: 2025-12-11 06:31:49.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:31:49 np0005554845 nova_compute[187128]: 2025-12-11 06:31:49.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:31:49 np0005554845 nova_compute[187128]: 2025-12-11 06:31:49.705 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:31:49 np0005554845 nova_compute[187128]: 2025-12-11 06:31:49.706 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:31:49 np0005554845 nova_compute[187128]: 2025-12-11 06:31:49.707 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:31:49 np0005554845 nova_compute[187128]: 2025-12-11 06:31:49.707 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:31:50 np0005554845 podman[232504]: 2025-12-11 06:31:50.143860597 +0000 UTC m=+0.074654588 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:31:50 np0005554845 nova_compute[187128]: 2025-12-11 06:31:50.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:31:52 np0005554845 nova_compute[187128]: 2025-12-11 06:31:52.665 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:31:53 np0005554845 nova_compute[187128]: 2025-12-11 06:31:53.247 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:31:56 np0005554845 podman[232530]: 2025-12-11 06:31:56.148916669 +0000 UTC m=+0.066744521 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ceilometer_agent_compute, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, 
org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:31:56 np0005554845 podman[232528]: 2025-12-11 06:31:56.165863123 +0000 UTC m=+0.081000355 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', 
'/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:31:56 np0005554845 podman[232529]: 2025-12-11 06:31:56.18179696 +0000 UTC m=+0.101129312 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, container_name=ovn_controller, managed_by=edpm_ansible, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.vendor=CentOS)
Dec 11 01:31:56 np0005554845 nova_compute[187128]: 2025-12-11 06:31:56.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:31:56 np0005554845 nova_compute[187128]: 2025-12-11 06:31:56.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:31:57 np0005554845 nova_compute[187128]: 2025-12-11 06:31:57.667 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:31:58 np0005554845 nova_compute[187128]: 2025-12-11 06:31:58.250 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:02 np0005554845 nova_compute[187128]: 2025-12-11 06:32:02.669 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:03 np0005554845 podman[232590]: 2025-12-11 06:32:03.138342177 +0000 UTC m=+0.066068174 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 11 01:32:03 np0005554845 nova_compute[187128]: 2025-12-11 06:32:03.289 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:06 np0005554845 podman[232613]: 2025-12-11 06:32:06.14880019 +0000 UTC m=+0.075272044 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, architecture=x86_64, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, vcs-type=git, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. 
This image is maintained by Red Hat and updated regularly., description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, release=1755695350, container_name=openstack_network_exporter, build-date=2025-08-20T13:12:41, version=9.6, io.openshift.tags=minimal rhel9, io.buildah.version=1.33.7, config_id=edpm, name=ubi9-minimal, vendor=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 01:32:06 np0005554845 podman[232612]: 2025-12-11 06:32:06.167808128 +0000 UTC m=+0.089928618 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:32:07 np0005554845 nova_compute[187128]: 2025-12-11 06:32:07.672 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:08 np0005554845 nova_compute[187128]: 2025-12-11 06:32:08.290 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:12 np0005554845 nova_compute[187128]: 2025-12-11 06:32:12.674 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:13 np0005554845 nova_compute[187128]: 2025-12-11 06:32:13.291 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:17 np0005554845 nova_compute[187128]: 2025-12-11 06:32:17.676 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:18 np0005554845 nova_compute[187128]: 2025-12-11 06:32:18.293 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:21 np0005554845 podman[232654]: 2025-12-11 06:32:21.175581501 +0000 UTC m=+0.092115186 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:32:22 np0005554845 nova_compute[187128]: 2025-12-11 06:32:22.679 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:23 np0005554845 nova_compute[187128]: 2025-12-11 06:32:23.326 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:32:26.249 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:32:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:32:26.249 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:32:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:32:26.250 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:32:27 np0005554845 podman[232679]: 2025-12-11 06:32:27.133336632 +0000 UTC m=+0.059614964 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, container_name=ovn_metadata_agent)
Dec 11 01:32:27 np0005554845 podman[232680]: 2025-12-11 06:32:27.15003451 +0000 UTC m=+0.084630170 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.vendor=CentOS, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true)
Dec 11 01:32:27 np0005554845 podman[232681]: 2025-12-11 06:32:27.161159861 +0000 UTC m=+0.089105808 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2)
Dec 11 01:32:27 np0005554845 nova_compute[187128]: 2025-12-11 06:32:27.681 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:28 np0005554845 nova_compute[187128]: 2025-12-11 06:32:28.327 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:32 np0005554845 nova_compute[187128]: 2025-12-11 06:32:32.683 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:33 np0005554845 nova_compute[187128]: 2025-12-11 06:32:33.329 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:34 np0005554845 podman[232740]: 2025-12-11 06:32:34.145246909 +0000 UTC m=+0.074920045 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, config_id=multipathd)
Dec 11 01:32:37 np0005554845 podman[232762]: 2025-12-11 06:32:37.128093109 +0000 UTC m=+0.054103419 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, distribution-scope=public, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, io.openshift.tags=minimal rhel9, com.redhat.component=ubi9-minimal-container, name=ubi9-minimal, container_name=openstack_network_exporter, url=https://catalog.redhat.com/en/search?searchType=containers, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible, architecture=x86_64, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., maintainer=Red Hat, Inc., release=1755695350, vendor=Red Hat, Inc., io.openshift.expose-services=, version=9.6, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, build-date=2025-08-20T13:12:41, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI)
Dec 11 01:32:37 np0005554845 podman[232761]: 2025-12-11 06:32:37.159625416 +0000 UTC m=+0.087923137 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:32:37 np0005554845 nova_compute[187128]: 2025-12-11 06:32:37.685 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:38 np0005554845 nova_compute[187128]: 2025-12-11 06:32:38.360 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:39 np0005554845 nova_compute[187128]: 2025-12-11 06:32:39.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:32:42 np0005554845 nova_compute[187128]: 2025-12-11 06:32:42.687 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:43 np0005554845 nova_compute[187128]: 2025-12-11 06:32:43.362 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:46 np0005554845 nova_compute[187128]: 2025-12-11 06:32:46.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:32:46 np0005554845 nova_compute[187128]: 2025-12-11 06:32:46.784 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:32:46 np0005554845 nova_compute[187128]: 2025-12-11 06:32:46.785 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:32:46 np0005554845 nova_compute[187128]: 2025-12-11 06:32:46.786 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:32:46 np0005554845 nova_compute[187128]: 2025-12-11 06:32:46.786 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:32:47 np0005554845 nova_compute[187128]: 2025-12-11 06:32:47.042 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:32:47 np0005554845 nova_compute[187128]: 2025-12-11 06:32:47.044 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5687MB free_disk=73.28357315063477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:32:47 np0005554845 nova_compute[187128]: 2025-12-11 06:32:47.044 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:32:47 np0005554845 nova_compute[187128]: 2025-12-11 06:32:47.045 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:32:47 np0005554845 nova_compute[187128]: 2025-12-11 06:32:47.198 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:32:47 np0005554845 nova_compute[187128]: 2025-12-11 06:32:47.199 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:32:47 np0005554845 nova_compute[187128]: 2025-12-11 06:32:47.409 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:32:47 np0005554845 nova_compute[187128]: 2025-12-11 06:32:47.595 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:32:47 np0005554845 nova_compute[187128]: 2025-12-11 06:32:47.598 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:32:47 np0005554845 nova_compute[187128]: 2025-12-11 06:32:47.598 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.553s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:32:47 np0005554845 nova_compute[187128]: 2025-12-11 06:32:47.689 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:48 np0005554845 nova_compute[187128]: 2025-12-11 06:32:48.364 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:50 np0005554845 nova_compute[187128]: 2025-12-11 06:32:50.593 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:32:50 np0005554845 nova_compute[187128]: 2025-12-11 06:32:50.593 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:32:50 np0005554845 nova_compute[187128]: 2025-12-11 06:32:50.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:32:51 np0005554845 nova_compute[187128]: 2025-12-11 06:32:51.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:32:51 np0005554845 nova_compute[187128]: 2025-12-11 06:32:51.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:32:51 np0005554845 nova_compute[187128]: 2025-12-11 06:32:51.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:32:51 np0005554845 nova_compute[187128]: 2025-12-11 06:32:51.711 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:32:51 np0005554845 nova_compute[187128]: 2025-12-11 06:32:51.711 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:32:51 np0005554845 nova_compute[187128]: 2025-12-11 06:32:51.712 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:32:52 np0005554845 podman[232807]: 2025-12-11 06:32:52.13358771 +0000 UTC m=+0.060634541 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:32:52 np0005554845 nova_compute[187128]: 2025-12-11 06:32:52.690 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:53 np0005554845 nova_compute[187128]: 2025-12-11 06:32:53.366 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:57 np0005554845 nova_compute[187128]: 2025-12-11 06:32:57.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:32:57 np0005554845 nova_compute[187128]: 2025-12-11 06:32:57.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:32:57 np0005554845 nova_compute[187128]: 2025-12-11 06:32:57.693 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:58 np0005554845 podman[232833]: 2025-12-11 06:32:58.155431252 +0000 UTC m=+0.073535520 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 01:32:58 np0005554845 podman[232831]: 2025-12-11 06:32:58.178010433 +0000 UTC m=+0.097392134 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent)
Dec 11 01:32:58 np0005554845 podman[232832]: 2025-12-11 06:32:58.236743283 +0000 UTC m=+0.150572749 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, container_name=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, tcib_managed=true)
Dec 11 01:32:58 np0005554845 nova_compute[187128]: 2025-12-11 06:32:58.368 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:32:59 np0005554845 nova_compute[187128]: 2025-12-11 06:32:59.689 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:33:02 np0005554845 nova_compute[187128]: 2025-12-11 06:33:02.720 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:03 np0005554845 nova_compute[187128]: 2025-12-11 06:33:03.369 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:05 np0005554845 podman[232898]: 2025-12-11 06:33:05.168826174 +0000 UTC m=+0.087280610 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 11 01:33:07 np0005554845 nova_compute[187128]: 2025-12-11 06:33:07.722 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:08 np0005554845 podman[232920]: 2025-12-11 06:33:08.133215994 +0000 UTC m=+0.053159028 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, build-date=2025-08-20T13:12:41, distribution-scope=public, com.redhat.component=ubi9-minimal-container, release=1755695350, io.openshift.expose-services=, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vendor=Red Hat, Inc., io.openshift.tags=minimal rhel9, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, architecture=x86_64, config_id=edpm, io.buildah.version=1.33.7, vcs-type=git, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., version=9.6, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., managed_by=edpm_ansible)
Dec 11 01:33:08 np0005554845 podman[232919]: 2025-12-11 06:33:08.134357375 +0000 UTC m=+0.054349150 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:33:08 np0005554845 nova_compute[187128]: 2025-12-11 06:33:08.437 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:12 np0005554845 nova_compute[187128]: 2025-12-11 06:33:12.724 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:13 np0005554845 nova_compute[187128]: 2025-12-11 06:33:13.439 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:17 np0005554845 nova_compute[187128]: 2025-12-11 06:33:17.726 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:18 np0005554845 nova_compute[187128]: 2025-12-11 06:33:18.440 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:22 np0005554845 nova_compute[187128]: 2025-12-11 06:33:22.728 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:23 np0005554845 podman[232962]: 2025-12-11 06:33:23.162871849 +0000 UTC m=+0.080983190 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:33:23 np0005554845 nova_compute[187128]: 2025-12-11 06:33:23.442 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:33:26.249 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:33:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:33:26.250 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:33:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:33:26.250 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:33:27 np0005554845 nova_compute[187128]: 2025-12-11 06:33:27.730 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:28 np0005554845 nova_compute[187128]: 2025-12-11 06:33:28.442 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:29 np0005554845 podman[232988]: 2025-12-11 06:33:29.153579253 +0000 UTC m=+0.075246264 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, container_name=ceilometer_agent_compute, org.label-schema.build-date=20251202, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=edpm, io.buildah.version=1.41.3, org.label-schema.license=GPLv2)
Dec 11 01:33:29 np0005554845 podman[232986]: 2025-12-11 06:33:29.160404638 +0000 UTC m=+0.080731853 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=ovn_metadata_agent, container_name=ovn_metadata_agent, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']})
Dec 11 01:33:29 np0005554845 podman[232987]: 2025-12-11 06:33:29.198342163 +0000 UTC m=+0.116541310 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, managed_by=edpm_ansible, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.103 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.104 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:33:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:33:32 np0005554845 nova_compute[187128]: 2025-12-11 06:33:32.770 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:33 np0005554845 nova_compute[187128]: 2025-12-11 06:33:33.444 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:36 np0005554845 podman[233049]: 2025-12-11 06:33:36.190649481 +0000 UTC m=+0.117893676 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, container_name=multipathd, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:33:37 np0005554845 nova_compute[187128]: 2025-12-11 06:33:37.773 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:38 np0005554845 nova_compute[187128]: 2025-12-11 06:33:38.514 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:39 np0005554845 podman[233070]: 2025-12-11 06:33:39.139572324 +0000 UTC m=+0.062999804 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:33:39 np0005554845 podman[233071]: 2025-12-11 06:33:39.144129806 +0000 UTC m=+0.064538604 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, com.redhat.component=ubi9-minimal-container, architecture=x86_64, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, config_id=edpm, managed_by=edpm_ansible, distribution-scope=public, io.openshift.expose-services=, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, vendor=Red Hat, Inc., build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, version=9.6, io.buildah.version=1.33.7, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., maintainer=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.openshift.tags=minimal rhel9, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b)
Dec 11 01:33:40 np0005554845 nova_compute[187128]: 2025-12-11 06:33:40.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:33:42 np0005554845 nova_compute[187128]: 2025-12-11 06:33:42.777 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:43 np0005554845 nova_compute[187128]: 2025-12-11 06:33:43.518 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:47 np0005554845 nova_compute[187128]: 2025-12-11 06:33:47.780 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.520 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.693 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.718 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.719 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.720 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.720 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.882 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.883 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5688MB free_disk=73.28357315063477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.883 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.883 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.989 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:33:48 np0005554845 nova_compute[187128]: 2025-12-11 06:33:48.990 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:33:49 np0005554845 nova_compute[187128]: 2025-12-11 06:33:49.012 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:33:49 np0005554845 nova_compute[187128]: 2025-12-11 06:33:49.024 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:33:49 np0005554845 nova_compute[187128]: 2025-12-11 06:33:49.027 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:33:49 np0005554845 nova_compute[187128]: 2025-12-11 06:33:49.027 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.144s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:33:50 np0005554845 nova_compute[187128]: 2025-12-11 06:33:50.023 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:33:50 np0005554845 nova_compute[187128]: 2025-12-11 06:33:50.693 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:33:51 np0005554845 nova_compute[187128]: 2025-12-11 06:33:51.693 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:33:51 np0005554845 nova_compute[187128]: 2025-12-11 06:33:51.694 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:33:52 np0005554845 nova_compute[187128]: 2025-12-11 06:33:52.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:33:52 np0005554845 nova_compute[187128]: 2025-12-11 06:33:52.780 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:53 np0005554845 nova_compute[187128]: 2025-12-11 06:33:53.572 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:53 np0005554845 nova_compute[187128]: 2025-12-11 06:33:53.693 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:33:53 np0005554845 nova_compute[187128]: 2025-12-11 06:33:53.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:33:53 np0005554845 nova_compute[187128]: 2025-12-11 06:33:53.694 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:33:53 np0005554845 nova_compute[187128]: 2025-12-11 06:33:53.815 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:33:54 np0005554845 podman[233117]: 2025-12-11 06:33:54.142160386 +0000 UTC m=+0.069930141 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:33:57 np0005554845 nova_compute[187128]: 2025-12-11 06:33:57.782 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:58 np0005554845 nova_compute[187128]: 2025-12-11 06:33:58.574 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:33:59 np0005554845 nova_compute[187128]: 2025-12-11 06:33:59.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:33:59 np0005554845 nova_compute[187128]: 2025-12-11 06:33:59.691 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:34:00 np0005554845 podman[233142]: 2025-12-11 06:34:00.149690255 +0000 UTC m=+0.074374581 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, tcib_managed=true, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ovn_metadata_agent, managed_by=edpm_ansible, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, config_id=ovn_metadata_agent)
Dec 11 01:34:00 np0005554845 podman[233144]: 2025-12-11 06:34:00.196819948 +0000 UTC m=+0.112228193 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=edpm, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:34:00 np0005554845 podman[233143]: 2025-12-11 06:34:00.204468925 +0000 UTC m=+0.128532174 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, org.label-schema.vendor=CentOS, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, container_name=ovn_controller, io.buildah.version=1.41.3)
Dec 11 01:34:02 np0005554845 nova_compute[187128]: 2025-12-11 06:34:02.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._run_pending_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:34:02 np0005554845 nova_compute[187128]: 2025-12-11 06:34:02.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11145#033[00m
Dec 11 01:34:02 np0005554845 nova_compute[187128]: 2025-12-11 06:34:02.713 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] There are 0 instances to clean _run_pending_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:11154#033[00m
Dec 11 01:34:02 np0005554845 nova_compute[187128]: 2025-12-11 06:34:02.796 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:03 np0005554845 nova_compute[187128]: 2025-12-11 06:34:03.577 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:06 np0005554845 nova_compute[187128]: 2025-12-11 06:34:06.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_incomplete_migrations run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:34:06 np0005554845 nova_compute[187128]: 2025-12-11 06:34:06.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Cleaning up deleted instances with incomplete migration  _cleanup_incomplete_migrations /usr/lib/python3.9/site-packages/nova/compute/manager.py:11183#033[00m
Dec 11 01:34:07 np0005554845 podman[233205]: 2025-12-11 06:34:07.156587128 +0000 UTC m=+0.077233137 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, container_name=multipathd, io.buildah.version=1.41.3, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=multipathd, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.vendor=CentOS, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:34:07 np0005554845 nova_compute[187128]: 2025-12-11 06:34:07.798 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:08 np0005554845 nova_compute[187128]: 2025-12-11 06:34:08.578 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:10 np0005554845 podman[233225]: 2025-12-11 06:34:10.156969762 +0000 UTC m=+0.072321846 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:34:10 np0005554845 podman[233226]: 2025-12-11 06:34:10.179122181 +0000 UTC m=+0.078864343 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, name=ubi9-minimal, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., vendor=Red Hat, Inc., com.redhat.component=ubi9-minimal-container, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, architecture=x86_64, build-date=2025-08-20T13:12:41, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, io.openshift.expose-services=, io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, vcs-type=git, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, managed_by=edpm_ansible, config_id=edpm, distribution-scope=public, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly.)
Dec 11 01:34:12 np0005554845 nova_compute[187128]: 2025-12-11 06:34:12.799 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:13 np0005554845 nova_compute[187128]: 2025-12-11 06:34:13.580 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:16 np0005554845 nova_compute[187128]: 2025-12-11 06:34:16.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._cleanup_expired_console_auth_tokens run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:34:17 np0005554845 nova_compute[187128]: 2025-12-11 06:34:17.802 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:18 np0005554845 nova_compute[187128]: 2025-12-11 06:34:18.584 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:22 np0005554845 nova_compute[187128]: 2025-12-11 06:34:22.804 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:23 np0005554845 nova_compute[187128]: 2025-12-11 06:34:23.586 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:25 np0005554845 podman[233270]: 2025-12-11 06:34:25.158443964 +0000 UTC m=+0.072658954 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']})
Dec 11 01:34:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:34:26.250 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:34:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:34:26.251 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:34:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:34:26.252 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:34:27 np0005554845 nova_compute[187128]: 2025-12-11 06:34:27.807 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:28 np0005554845 nova_compute[187128]: 2025-12-11 06:34:28.588 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:31 np0005554845 podman[233294]: 2025-12-11 06:34:31.148364878 +0000 UTC m=+0.073524097 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team)
Dec 11 01:34:31 np0005554845 podman[233296]: 2025-12-11 06:34:31.159209981 +0000 UTC m=+0.065760768 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, config_id=edpm, container_name=ceilometer_agent_compute, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0, managed_by=edpm_ansible)
Dec 11 01:34:31 np0005554845 podman[233295]: 2025-12-11 06:34:31.189093278 +0000 UTC m=+0.100071884 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_id=ovn_controller, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.schema-version=1.0, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, io.buildah.version=1.41.3, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']})
Dec 11 01:34:32 np0005554845 nova_compute[187128]: 2025-12-11 06:34:32.808 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:33 np0005554845 nova_compute[187128]: 2025-12-11 06:34:33.588 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:37 np0005554845 nova_compute[187128]: 2025-12-11 06:34:37.811 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:38 np0005554845 podman[233355]: 2025-12-11 06:34:38.139847025 +0000 UTC m=+0.069291564 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_managed=true, config_id=multipathd, container_name=multipathd, managed_by=edpm_ansible, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb)
Dec 11 01:34:38 np0005554845 nova_compute[187128]: 2025-12-11 06:34:38.590 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:40 np0005554845 nova_compute[187128]: 2025-12-11 06:34:40.708 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:34:41 np0005554845 podman[233375]: 2025-12-11 06:34:41.174188125 +0000 UTC m=+0.088509773 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible)
Dec 11 01:34:41 np0005554845 podman[233376]: 2025-12-11 06:34:41.185821339 +0000 UTC m=+0.093628451 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, release=1755695350, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.openshift.tags=minimal rhel9, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9., com.redhat.component=ubi9-minimal-container, vendor=Red Hat, Inc., url=https://catalog.redhat.com/en/search?searchType=containers, architecture=x86_64, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, vcs-type=git, build-date=2025-08-20T13:12:41, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, name=ubi9-minimal, version=9.6, container_name=openstack_network_exporter, maintainer=Red Hat, Inc., managed_by=edpm_ansible, config_id=edpm, distribution-scope=public)
Dec 11 01:34:42 np0005554845 nova_compute[187128]: 2025-12-11 06:34:42.813 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:43 np0005554845 nova_compute[187128]: 2025-12-11 06:34:43.593 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:47 np0005554845 nova_compute[187128]: 2025-12-11 06:34:47.815 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:48 np0005554845 nova_compute[187128]: 2025-12-11 06:34:48.598 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:49 np0005554845 nova_compute[187128]: 2025-12-11 06:34:49.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:34:49 np0005554845 nova_compute[187128]: 2025-12-11 06:34:49.716 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:34:49 np0005554845 nova_compute[187128]: 2025-12-11 06:34:49.717 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:34:49 np0005554845 nova_compute[187128]: 2025-12-11 06:34:49.717 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:34:49 np0005554845 nova_compute[187128]: 2025-12-11 06:34:49.717 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:34:49 np0005554845 nova_compute[187128]: 2025-12-11 06:34:49.895 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:34:49 np0005554845 nova_compute[187128]: 2025-12-11 06:34:49.896 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5701MB free_disk=73.28357315063477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:34:49 np0005554845 nova_compute[187128]: 2025-12-11 06:34:49.897 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:34:49 np0005554845 nova_compute[187128]: 2025-12-11 06:34:49.897 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:34:50 np0005554845 nova_compute[187128]: 2025-12-11 06:34:50.004 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:34:50 np0005554845 nova_compute[187128]: 2025-12-11 06:34:50.005 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:34:50 np0005554845 nova_compute[187128]: 2025-12-11 06:34:50.029 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:34:50 np0005554845 nova_compute[187128]: 2025-12-11 06:34:50.045 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:34:50 np0005554845 nova_compute[187128]: 2025-12-11 06:34:50.047 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:34:50 np0005554845 nova_compute[187128]: 2025-12-11 06:34:50.048 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.151s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:34:51 np0005554845 nova_compute[187128]: 2025-12-11 06:34:51.045 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:34:51 np0005554845 nova_compute[187128]: 2025-12-11 06:34:51.045 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:34:51 np0005554845 nova_compute[187128]: 2025-12-11 06:34:51.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:34:52 np0005554845 nova_compute[187128]: 2025-12-11 06:34:52.816 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:53 np0005554845 nova_compute[187128]: 2025-12-11 06:34:53.599 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:53 np0005554845 nova_compute[187128]: 2025-12-11 06:34:53.690 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:34:53 np0005554845 nova_compute[187128]: 2025-12-11 06:34:53.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:34:55 np0005554845 nova_compute[187128]: 2025-12-11 06:34:55.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:34:55 np0005554845 nova_compute[187128]: 2025-12-11 06:34:55.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:34:55 np0005554845 nova_compute[187128]: 2025-12-11 06:34:55.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:34:55 np0005554845 nova_compute[187128]: 2025-12-11 06:34:55.720 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:34:56 np0005554845 podman[233418]: 2025-12-11 06:34:56.129587783 +0000 UTC m=+0.062770779 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:34:57 np0005554845 nova_compute[187128]: 2025-12-11 06:34:57.818 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:34:58 np0005554845 nova_compute[187128]: 2025-12-11 06:34:58.603 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:01 np0005554845 nova_compute[187128]: 2025-12-11 06:35:01.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:35:01 np0005554845 nova_compute[187128]: 2025-12-11 06:35:01.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:35:02 np0005554845 podman[233444]: 2025-12-11 06:35:02.165437706 +0000 UTC m=+0.083905348 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.vendor=CentOS, org.label-schema.name=CentOS Stream 9 Base Image, tcib_managed=true, config_id=edpm, org.label-schema.schema-version=1.0)
Dec 11 01:35:02 np0005554845 podman[233443]: 2025-12-11 06:35:02.291879842 +0000 UTC m=+0.214092436 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, org.label-schema.build-date=20251202, container_name=ovn_controller, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_managed=true, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_controller, managed_by=edpm_ansible, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:35:02 np0005554845 podman[233442]: 2025-12-11 06:35:02.300812154 +0000 UTC m=+0.224075445 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.build-date=20251202, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, config_id=ovn_metadata_agent, org.label-schema.schema-version=1.0)
Dec 11 01:35:02 np0005554845 nova_compute[187128]: 2025-12-11 06:35:02.687 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:35:02 np0005554845 nova_compute[187128]: 2025-12-11 06:35:02.821 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:03 np0005554845 nova_compute[187128]: 2025-12-11 06:35:03.603 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:07 np0005554845 nova_compute[187128]: 2025-12-11 06:35:07.822 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:08 np0005554845 nova_compute[187128]: 2025-12-11 06:35:08.606 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:09 np0005554845 podman[233504]: 2025-12-11 06:35:09.163777568 +0000 UTC m=+0.090684571 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, managed_by=edpm_ansible, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, io.buildah.version=1.41.3, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0)
Dec 11 01:35:12 np0005554845 podman[233524]: 2025-12-11 06:35:12.164034038 +0000 UTC m=+0.077145236 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_id=edpm, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']})
Dec 11 01:35:12 np0005554845 podman[233525]: 2025-12-11 06:35:12.16779389 +0000 UTC m=+0.080714063 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, release=1755695350, url=https://catalog.redhat.com/en/search?searchType=containers, distribution-scope=public, io.openshift.tags=minimal rhel9, vcs-type=git, version=9.6, com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, maintainer=Red Hat, Inc., vendor=Red Hat, Inc., architecture=x86_64, io.buildah.version=1.33.7, io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, managed_by=edpm_ansible, build-date=2025-08-20T13:12:41, container_name=openstack_network_exporter, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., name=ubi9-minimal, config_id=edpm, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.expose-services=, com.redhat.component=ubi9-minimal-container, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9.)
Dec 11 01:35:12 np0005554845 nova_compute[187128]: 2025-12-11 06:35:12.825 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:13 np0005554845 nova_compute[187128]: 2025-12-11 06:35:13.609 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:17 np0005554845 nova_compute[187128]: 2025-12-11 06:35:17.827 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:18 np0005554845 nova_compute[187128]: 2025-12-11 06:35:18.649 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:22 np0005554845 nova_compute[187128]: 2025-12-11 06:35:22.830 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:23 np0005554845 nova_compute[187128]: 2025-12-11 06:35:23.651 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:35:26.251 104320 DEBUG oslo_concurrency.lockutils [-] Acquiring lock "_check_child_processes" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:35:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:35:26.252 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" acquired by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:35:26 np0005554845 ovn_metadata_agent[104315]: 2025-12-11 06:35:26.252 104320 DEBUG oslo_concurrency.lockutils [-] Lock "_check_child_processes" "released" by "neutron.agent.linux.external_process.ProcessMonitor._check_child_processes" :: held 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:35:27 np0005554845 podman[233568]: 2025-12-11 06:35:27.141999738 +0000 UTC m=+0.069882200 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:35:27 np0005554845 nova_compute[187128]: 2025-12-11 06:35:27.832 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:28 np0005554845 nova_compute[187128]: 2025-12-11 06:35:28.651 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster cpu, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.105 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.allocation, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.capacity, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.error, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.incoming.bytes.rate, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster memory.usage, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.latency, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.bytes.delta, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.iops, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.read.bytes, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.106 12 DEBUG ceilometer.polling.manager [-] Skip pollster disk.device.write.requests, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:30 np0005554845 ceilometer_agent_compute[197813]: 2025-12-11 06:35:30.107 12 DEBUG ceilometer.polling.manager [-] Skip pollster network.outgoing.packets.drop, no  resources found this cycle poll_and_notify /usr/lib/python3.9/site-packages/ceilometer/polling/manager.py:193
Dec 11 01:35:32 np0005554845 nova_compute[187128]: 2025-12-11 06:35:32.834 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:33 np0005554845 podman[233593]: 2025-12-11 06:35:33.157517751 +0000 UTC m=+0.079906050 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, tcib_managed=true, config_id=ovn_metadata_agent, io.buildah.version=1.41.3, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:35:33 np0005554845 podman[233595]: 2025-12-11 06:35:33.199385273 +0000 UTC m=+0.110543628 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, tcib_managed=true, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image, config_id=edpm, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, container_name=ceilometer_agent_compute, managed_by=edpm_ansible, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, org.label-schema.schema-version=1.0)
Dec 11 01:35:33 np0005554845 podman[233594]: 2025-12-11 06:35:33.202434235 +0000 UTC m=+0.118665118 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, container_name=ovn_controller, managed_by=edpm_ansible, org.label-schema.license=GPLv2, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, config_id=ovn_controller, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, org.label-schema.vendor=CentOS)
Dec 11 01:35:33 np0005554845 nova_compute[187128]: 2025-12-11 06:35:33.653 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:37 np0005554845 nova_compute[187128]: 2025-12-11 06:35:37.837 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:38 np0005554845 nova_compute[187128]: 2025-12-11 06:35:38.697 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:40 np0005554845 podman[233657]: 2025-12-11 06:35:40.153520389 +0000 UTC m=+0.077537578 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.schema-version=1.0, io.buildah.version=1.41.3, tcib_managed=true, config_id=multipathd, org.label-schema.vendor=CentOS, container_name=multipathd, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, maintainer=OpenStack Kubernetes Operator team, managed_by=edpm_ansible, org.label-schema.build-date=20251202)
Dec 11 01:35:41 np0005554845 nova_compute[187128]: 2025-12-11 06:35:41.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_volume_usage run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:35:42 np0005554845 nova_compute[187128]: 2025-12-11 06:35:42.838 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:43 np0005554845 podman[233677]: 2025-12-11 06:35:43.153161101 +0000 UTC m=+0.079090948 container health_status 1a706b3d9f6b5e86e7f767adf7d9e505adae373fe5f0529a7b30d09b83c6cbfb (image=quay.io/prometheus/node-exporter:v1.5.0, name=node_exporter, health_status=healthy, health_failing_streak=0, health_log=, container_name=node_exporter, maintainer=The Prometheus Authors <prometheus-developers@googlegroups.com>, managed_by=edpm_ansible, config_data={'image': 'quay.io/prometheus/node-exporter:v1.5.0', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9100:9100'], 'command': ['--web.config.file=/etc/node_exporter/node_exporter.yaml', '--web.disable-exporter-metrics', '--collector.systemd', '--collector.systemd.unit-include=(edpm_.*|ovs.*|openvswitch|virt.*|rsyslog)\\.service', '--no-collector.dmi', '--no-collector.entropy', '--no-collector.thermal_zone', '--no-collector.time', '--no-collector.timex', '--no-collector.uname', '--no-collector.stat', '--no-collector.hwmon', '--no-collector.os', '--no-collector.selinux', '--no-collector.textfile', '--no-collector.powersupplyclass', '--no-collector.pressure', '--no-collector.rapl'], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck node_exporter', 'mount': '/var/lib/openstack/healthchecks/node_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/node_exporter.yaml:/etc/node_exporter/node_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/node_exporter/tls:z', '/var/run/dbus/system_bus_socket:/var/run/dbus/system_bus_socket:rw', '/var/lib/openstack/healthchecks/node_exporter:/openstack:ro,z']}, config_id=edpm)
Dec 11 01:35:43 np0005554845 podman[233678]: 2025-12-11 06:35:43.175564427 +0000 UTC m=+0.085869281 container health_status cf95474c75dfd899edd6259a98af6b30b148b4b70af7cf7c1cd5868fa5c128e5 (image=quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified, name=openstack_network_exporter, health_status=healthy, health_failing_streak=0, health_log=, vcs-type=git, architecture=x86_64, io.openshift.expose-services=, url=https://catalog.redhat.com/en/search?searchType=containers, vcs-ref=f4b088292653bbf5ca8188a5e59ffd06a8671d4b, description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., version=9.6, com.redhat.component=ubi9-minimal-container, io.buildah.version=1.33.7, maintainer=Red Hat, Inc., io.k8s.display-name=Red Hat Universal Base Image 9 Minimal, name=ubi9-minimal, config_data={'image': 'quay.io/openstack-k8s-operators/openstack-network-exporter:current-podified', 'restart': 'always', 'recreate': True, 'privileged': True, 'ports': ['9105:9105'], 'command': [], 'net': 'host', 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'OPENSTACK_NETWORK_EXPORTER_YAML': '/etc/openstack_network_exporter/openstack_network_exporter.yaml'}, 'healthcheck': {'test': '/openstack/healthcheck openstack-netwo', 'mount': '/var/lib/openstack/healthchecks/openstack_network_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/openstack_network_exporter.yaml:/etc/openstack_network_exporter/openstack_network_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/openstack_network_exporter/tls:z', '/var/run/openvswitch:/run/openvswitch:rw,z', '/var/lib/openvswitch/ovn:/run/ovn:rw,z', '/proc:/host/proc:ro', '/var/lib/openstack/healthchecks/openstack_network_exporter:/openstack:ro,z']}, distribution-scope=public, managed_by=edpm_ansible, vendor=Red Hat, Inc., com.redhat.license_terms=https://www.redhat.com/en/about/red-hat-end-user-license-agreements#UBI, io.k8s.description=The Universal Base Image Minimal is a stripped down image that uses microdnf as a package manager. This base image is freely redistributable, but Red Hat only supports Red Hat technologies through subscriptions for Red Hat products. This image is maintained by Red Hat and updated regularly., io.openshift.tags=minimal rhel9, container_name=openstack_network_exporter, config_id=edpm, build-date=2025-08-20T13:12:41, release=1755695350, summary=Provides the latest release of the minimal Red Hat Universal Base Image 9 Minimal.)
Dec 11 01:35:43 np0005554845 nova_compute[187128]: 2025-12-11 06:35:43.701 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:47 np0005554845 nova_compute[187128]: 2025-12-11 06:35:47.863 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:48 np0005554845 nova_compute[187128]: 2025-12-11 06:35:48.701 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:49 np0005554845 nova_compute[187128]: 2025-12-11 06:35:49.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:35:49 np0005554845 nova_compute[187128]: 2025-12-11 06:35:49.719 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:35:49 np0005554845 nova_compute[187128]: 2025-12-11 06:35:49.720 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: waited 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:35:49 np0005554845 nova_compute[187128]: 2025-12-11 06:35:49.720 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker.clean_compute_node_cache" :: held 0.001s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:35:49 np0005554845 nova_compute[187128]: 2025-12-11 06:35:49.721 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Auditing locally available compute resources for compute-2.ctlplane.example.com (node: compute-2.ctlplane.example.com) update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:861#033[00m
Dec 11 01:35:49 np0005554845 nova_compute[187128]: 2025-12-11 06:35:49.934 187132 WARNING nova.virt.libvirt.driver [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] This host appears to have multiple sockets per NUMA node. The `socket` PCI NUMA affinity will not be supported.#033[00m
Dec 11 01:35:49 np0005554845 nova_compute[187128]: 2025-12-11 06:35:49.936 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Hypervisor/Node resource view: name=compute-2.ctlplane.example.com free_ram=5696MB free_disk=73.28357315063477GB free_vcpus=8 pci_devices=[{"dev_id": "pci_0000_00_07_0", "address": "0000:00:07.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_03_0", "address": "0000:00:03.0", "product_id": "1000", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_02_0", "address": "0000:00:02.0", "product_id": "1050", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1050", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_2", "address": "0000:00:01.2", "product_id": "7020", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7020", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_04_0", "address": "0000:00:04.0", "product_id": "1001", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1001", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_1", "address": "0000:00:01.1", "product_id": "7010", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7010", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_3", "address": "0000:00:01.3", "product_id": "7113", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7113", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_05_0", "address": "0000:00:05.0", "product_id": "1002", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1002", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_00_0", "address": "0000:00:00.0", "product_id": "1237", "vendor_id": "8086", "numa_node": null, "label": "label_8086_1237", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_01_0", "address": "0000:00:01.0", "product_id": "7000", "vendor_id": "8086", "numa_node": null, "label": "label_8086_7000", "dev_type": "type-PCI"}, {"dev_id": "pci_0000_00_06_0", "address": "0000:00:06.0", "product_id": "1005", "vendor_id": "1af4", "numa_node": null, "label": "label_1af4_1005", "dev_type": "type-PCI"}] _report_hypervisor_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1034#033[00m
Dec 11 01:35:49 np0005554845 nova_compute[187128]: 2025-12-11 06:35:49.936 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Acquiring lock "compute_resources" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:404#033[00m
Dec 11 01:35:49 np0005554845 nova_compute[187128]: 2025-12-11 06:35:49.936 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" acquired by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: waited 0.000s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:409#033[00m
Dec 11 01:35:50 np0005554845 nova_compute[187128]: 2025-12-11 06:35:50.085 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Total usable vcpus: 8, total allocated vcpus: 0 _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1057#033[00m
Dec 11 01:35:50 np0005554845 nova_compute[187128]: 2025-12-11 06:35:50.086 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Final resource view: name=compute-2.ctlplane.example.com phys_ram=7679MB used_ram=512MB phys_disk=79GB used_disk=0GB total_vcpus=8 used_vcpus=0 pci_stats=[] _report_final_resource_view /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:1066#033[00m
Dec 11 01:35:50 np0005554845 nova_compute[187128]: 2025-12-11 06:35:50.100 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing inventories for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9 _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:804#033[00m
Dec 11 01:35:50 np0005554845 nova_compute[187128]: 2025-12-11 06:35:50.114 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating ProviderTree inventory for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 from _refresh_and_get_inventory using data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} _refresh_and_get_inventory /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:768#033[00m
Dec 11 01:35:50 np0005554845 nova_compute[187128]: 2025-12-11 06:35:50.114 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Updating inventory in ProviderTree for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 with inventory: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:176#033[00m
Dec 11 01:35:50 np0005554845 nova_compute[187128]: 2025-12-11 06:35:50.128 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing aggregate associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, aggregates: None _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:813#033[00m
Dec 11 01:35:50 np0005554845 nova_compute[187128]: 2025-12-11 06:35:50.148 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Refreshing trait associations for resource provider eece7817-9d4f-4ebe-96c8-a659f76170f9, traits: COMPUTE_ACCELERATORS,HW_CPU_X86_SSSE3,COMPUTE_STORAGE_BUS_SCSI,COMPUTE_NET_VIF_MODEL_SPAPR_VLAN,COMPUTE_NET_VIF_MODEL_PCNET,HW_CPU_X86_SSE2,COMPUTE_IMAGE_TYPE_ISO,COMPUTE_RESCUE_BFV,COMPUTE_VOLUME_EXTEND,COMPUTE_IMAGE_TYPE_RAW,COMPUTE_DEVICE_TAGGING,COMPUTE_STORAGE_BUS_USB,COMPUTE_STORAGE_BUS_SATA,COMPUTE_STORAGE_BUS_VIRTIO,HW_CPU_X86_MMX,COMPUTE_NODE,COMPUTE_VIOMMU_MODEL_VIRTIO,COMPUTE_GRAPHICS_MODEL_NONE,COMPUTE_NET_VIF_MODEL_E1000,COMPUTE_IMAGE_TYPE_ARI,COMPUTE_TRUSTED_CERTS,HW_CPU_X86_SSE41,COMPUTE_SOCKET_PCI_NUMA_AFFINITY,COMPUTE_NET_ATTACH_INTERFACE,COMPUTE_NET_VIF_MODEL_NE2K_PCI,COMPUTE_VIOMMU_MODEL_INTEL,COMPUTE_GRAPHICS_MODEL_BOCHS,COMPUTE_GRAPHICS_MODEL_VGA,COMPUTE_IMAGE_TYPE_AKI,COMPUTE_STORAGE_BUS_IDE,COMPUTE_IMAGE_TYPE_QCOW2,COMPUTE_STORAGE_BUS_FDC,COMPUTE_SECURITY_UEFI_SECURE_BOOT,HW_CPU_X86_SSE42,COMPUTE_VOLUME_ATTACH_WITH_TAG,COMPUTE_VIOMMU_MODEL_AUTO,COMPUTE_NET_VIF_MODEL_VIRTIO,COMPUTE_NET_VIF_MODEL_VMXNET3,COMPUTE_NET_VIF_MODEL_E1000E,COMPUTE_VOLUME_MULTI_ATTACH,COMPUTE_GRAPHICS_MODEL_CIRRUS,COMPUTE_SECURITY_TPM_1_2,COMPUTE_IMAGE_TYPE_AMI,COMPUTE_NET_VIF_MODEL_RTL8139,COMPUTE_GRAPHICS_MODEL_VIRTIO,COMPUTE_SECURITY_TPM_2_0,HW_CPU_X86_SSE,COMPUTE_NET_ATTACH_INTERFACE_WITH_TAG _refresh_associations /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:825#033[00m
Dec 11 01:35:50 np0005554845 nova_compute[187128]: 2025-12-11 06:35:50.169 187132 DEBUG nova.compute.provider_tree [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed in ProviderTree for provider: eece7817-9d4f-4ebe-96c8-a659f76170f9 update_inventory /usr/lib/python3.9/site-packages/nova/compute/provider_tree.py:180#033[00m
Dec 11 01:35:50 np0005554845 nova_compute[187128]: 2025-12-11 06:35:50.183 187132 DEBUG nova.scheduler.client.report [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Inventory has not changed for provider eece7817-9d4f-4ebe-96c8-a659f76170f9 based on inventory data: {'VCPU': {'total': 8, 'reserved': 0, 'min_unit': 1, 'max_unit': 8, 'step_size': 1, 'allocation_ratio': 4.0}, 'MEMORY_MB': {'total': 7679, 'reserved': 512, 'min_unit': 1, 'max_unit': 7679, 'step_size': 1, 'allocation_ratio': 1.0}, 'DISK_GB': {'total': 79, 'reserved': 1, 'min_unit': 1, 'max_unit': 79, 'step_size': 1, 'allocation_ratio': 0.9}} set_inventory_for_provider /usr/lib/python3.9/site-packages/nova/scheduler/client/report.py:940#033[00m
Dec 11 01:35:50 np0005554845 nova_compute[187128]: 2025-12-11 06:35:50.184 187132 DEBUG nova.compute.resource_tracker [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Compute_service record updated for compute-2.ctlplane.example.com:compute-2.ctlplane.example.com _update_available_resource /usr/lib/python3.9/site-packages/nova/compute/resource_tracker.py:995#033[00m
Dec 11 01:35:50 np0005554845 nova_compute[187128]: 2025-12-11 06:35:50.185 187132 DEBUG oslo_concurrency.lockutils [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Lock "compute_resources" "released" by "nova.compute.resource_tracker.ResourceTracker._update_available_resource" :: held 0.248s inner /usr/lib/python3.9/site-packages/oslo_concurrency/lockutils.py:423#033[00m
Dec 11 01:35:51 np0005554845 nova_compute[187128]: 2025-12-11 06:35:51.180 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._check_instance_build_time run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:35:51 np0005554845 nova_compute[187128]: 2025-12-11 06:35:51.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rescued_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:35:51 np0005554845 nova_compute[187128]: 2025-12-11 06:35:51.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:35:52 np0005554845 nova_compute[187128]: 2025-12-11 06:35:52.863 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:53 np0005554845 nova_compute[187128]: 2025-12-11 06:35:53.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_unconfirmed_resizes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:35:53 np0005554845 nova_compute[187128]: 2025-12-11 06:35:53.702 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:54 np0005554845 nova_compute[187128]: 2025-12-11 06:35:54.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._poll_rebooting_instances run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:35:55 np0005554845 nova_compute[187128]: 2025-12-11 06:35:55.692 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:35:55 np0005554845 nova_compute[187128]: 2025-12-11 06:35:55.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9858#033[00m
Dec 11 01:35:55 np0005554845 nova_compute[187128]: 2025-12-11 06:35:55.693 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9862#033[00m
Dec 11 01:35:55 np0005554845 nova_compute[187128]: 2025-12-11 06:35:55.717 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python3.9/site-packages/nova/compute/manager.py:9944#033[00m
Dec 11 01:35:57 np0005554845 nova_compute[187128]: 2025-12-11 06:35:57.865 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:35:58 np0005554845 podman[233722]: 2025-12-11 06:35:58.113964687 +0000 UTC m=+0.048616145 container health_status 4a7249d964dbd9b1c3ab40604a3022e766ddceb32974fae78bf6d3984c79fc1a (image=quay.io/navidys/prometheus-podman-exporter:v1.10.1, name=podman_exporter, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/navidys/prometheus-podman-exporter:v1.10.1', 'restart': 'always', 'recreate': True, 'user': 'root', 'privileged': True, 'ports': ['9882:9882'], 'net': 'host', 'command': ['--web.config.file=/etc/podman_exporter/podman_exporter.yaml'], 'environment': {'OS_ENDPOINT_TYPE': 'internal', 'CONTAINER_HOST': 'unix:///run/podman/podman.sock'}, 'healthcheck': {'test': '/openstack/healthcheck podman_exporter', 'mount': '/var/lib/openstack/healthchecks/podman_exporter'}, 'volumes': ['/var/lib/openstack/config/telemetry/podman_exporter.yaml:/etc/podman_exporter/podman_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/podman_exporter/tls:z', '/run/podman/podman.sock:/run/podman/podman.sock:rw,z', '/var/lib/openstack/healthchecks/podman_exporter:/openstack:ro,z']}, config_id=edpm, container_name=podman_exporter, maintainer=Navid Yaghoobi <navidys@fedoraproject.org>, managed_by=edpm_ansible)
Dec 11 01:35:58 np0005554845 nova_compute[187128]: 2025-12-11 06:35:58.706 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:36:01 np0005554845 systemd-logind[789]: New session 31 of user zuul.
Dec 11 01:36:01 np0005554845 systemd[1]: Started Session 31 of User zuul.
Dec 11 01:36:02 np0005554845 nova_compute[187128]: 2025-12-11 06:36:02.691 187132 DEBUG oslo_service.periodic_task [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python3.9/site-packages/oslo_service/periodic_task.py:210#033[00m
Dec 11 01:36:02 np0005554845 nova_compute[187128]: 2025-12-11 06:36:02.692 187132 DEBUG nova.compute.manager [None req-f3cc6654-2e5c-4abf-8346-04adb7efdcef - - - - - -] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python3.9/site-packages/nova/compute/manager.py:10477#033[00m
Dec 11 01:36:02 np0005554845 nova_compute[187128]: 2025-12-11 06:36:02.867 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:36:03 np0005554845 nova_compute[187128]: 2025-12-11 06:36:03.707 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:36:03 np0005554845 podman[233890]: 2025-12-11 06:36:03.996119218 +0000 UTC m=+0.066116327 container health_status 63b4f8c7576d8527d0d5cb74294ed7af9d38316bfee63f77c6e2cc454c99cca0 (image=quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified, name=ovn_metadata_agent, health_status=healthy, health_failing_streak=0, health_log=, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=ovn_metadata_agent, maintainer=OpenStack Kubernetes Operator team, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, container_name=ovn_metadata_agent, org.label-schema.build-date=20251202, org.label-schema.schema-version=1.0, tcib_managed=true, config_data={'cgroupns': 'host', 'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'EDPM_CONFIG_HASH': '0823bd3e096c75f72e4a95820d41b0d4b6a1172bd2892ddb9f29b788a11bc87d'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_metadata_agent', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-neutron-metadata-agent-ovn:current-podified', 'net': 'host', 'pid': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/run/openvswitch:/run/openvswitch:z', '/var/lib/config-data/ansible-generated/neutron-ovn-metadata-agent:/etc/neutron.conf.d:z', '/run/netns:/run/netns:shared', '/var/lib/kolla/config_files/ovn_metadata_agent.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/neutron:/var/lib/neutron:shared,z', '/var/lib/neutron/ovn_metadata_haproxy_wrapper:/usr/local/bin/haproxy:ro', '/var/lib/neutron/kill_scripts:/etc/neutron/kill_scripts:ro', '/var/lib/openstack/cacerts/neutron-metadata/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/neutron-metadata/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_metadata_agent:/openstack:ro,z']}, io.buildah.version=1.41.3)
Dec 11 01:36:03 np0005554845 podman[233892]: 2025-12-11 06:36:03.998920184 +0000 UTC m=+0.069875810 container health_status f879fadded079a1a03ceebf369949c108f7af12d548274ffb760cb93d328661d (image=quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified, name=ceilometer_agent_compute, health_status=healthy, health_failing_streak=0, health_log=, config_data={'image': 'quay.io/podified-antelope-centos9/openstack-ceilometer-compute:current-podified', 'user': 'ceilometer', 'restart': 'always', 'command': 'kolla_start', 'security_opt': 'label:type:ceilometer_polling_t', 'net': 'host', 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS', 'OS_ENDPOINT_TYPE': 'internal'}, 'healthcheck': {'test': '/openstack/healthcheck compute', 'mount': '/var/lib/openstack/healthchecks/ceilometer_agent_compute'}, 'volumes': ['/var/lib/openstack/config/telemetry:/var/lib/openstack/config/:z', '/var/lib/openstack/config/telemetry/ceilometer-agent-compute.json:/var/lib/kolla/config_files/config.json:z', '/run/libvirt:/run/libvirt:shared,ro', '/etc/hosts:/etc/hosts:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/var/lib/openstack/cacerts/telemetry/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/config/telemetry/ceilometer_prom_exporter.yaml:/etc/ceilometer/ceilometer_prom_exporter.yaml:z', '/var/lib/openstack/certs/telemetry/default:/etc/ceilometer/tls:z', '/dev/log:/dev/log', '/var/lib/openstack/healthchecks/ceilometer_agent_compute:/openstack:ro,z']}, org.label-schema.schema-version=1.0, container_name=ceilometer_agent_compute, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, config_id=edpm, io.buildah.version=1.41.3, managed_by=edpm_ansible, org.label-schema.license=GPLv2, tcib_managed=true)
Dec 11 01:36:04 np0005554845 podman[233891]: 2025-12-11 06:36:04.027198227 +0000 UTC m=+0.097963998 container health_status a9e8acea839839707776b056f8a3d0da5d65c2b5371f9f97870e2a7ad9a9d5b1 (image=quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified, name=ovn_controller, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.schema-version=1.0, config_data={'depends_on': ['openvswitch.service'], 'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/ovn_controller', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-ovn-controller:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'user': 'root', 'volumes': ['/lib/modules:/lib/modules:ro', '/run:/run', '/var/lib/openvswitch/ovn:/run/ovn:shared,z', '/var/lib/kolla/config_files/ovn_controller.json:/var/lib/kolla/config_files/config.json:ro', '/var/lib/openstack/cacerts/ovn/tls-ca-bundle.pem:/etc/pki/ca-trust/extracted/pem/tls-ca-bundle.pem:ro,z', '/var/lib/openstack/certs/ovn/default/ca.crt:/etc/pki/tls/certs/ovndbca.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.crt:/etc/pki/tls/certs/ovndb.crt:ro,z', '/var/lib/openstack/certs/ovn/default/tls.key:/etc/pki/tls/private/ovndb.key:ro,Z', '/var/lib/openstack/healthchecks/ovn_controller:/openstack:ro,z']}, container_name=ovn_controller, managed_by=edpm_ansible, maintainer=OpenStack Kubernetes Operator team, org.label-schema.vendor=CentOS, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_id=ovn_controller, io.buildah.version=1.41.3, org.label-schema.build-date=20251202, org.label-schema.license=GPLv2, org.label-schema.name=CentOS Stream 9 Base Image)
Dec 11 01:36:06 np0005554845 ovs-vsctl[233983]: ovs|00001|db_ctl_base|ERR|no key "dpdk-init" in Open_vSwitch record "." column other_config
Dec 11 01:36:07 np0005554845 virtqemud[186638]: Failed to connect socket to '/var/run/libvirt/virtnetworkd-sock-ro': No such file or directory
Dec 11 01:36:07 np0005554845 virtqemud[186638]: Failed to connect socket to '/var/run/libvirt/virtnwfilterd-sock-ro': No such file or directory
Dec 11 01:36:07 np0005554845 virtqemud[186638]: Failed to connect socket to '/var/run/libvirt/virtstoraged-sock-ro': No such file or directory
Dec 11 01:36:07 np0005554845 nova_compute[187128]: 2025-12-11 06:36:07.869 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:36:08 np0005554845 nova_compute[187128]: 2025-12-11 06:36:08.708 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
Dec 11 01:36:10 np0005554845 systemd[1]: Starting Hostname Service...
Dec 11 01:36:10 np0005554845 podman[234510]: 2025-12-11 06:36:10.935514636 +0000 UTC m=+0.073741553 container health_status eeb3b443ef56256976bec80ccbe3d074bae740509bfd82aa52ecdc2d7f164bec (image=quay.io/podified-antelope-centos9/openstack-multipathd:current-podified, name=multipathd, health_status=healthy, health_failing_streak=0, health_log=, org.label-schema.vendor=CentOS, io.buildah.version=1.41.3, maintainer=OpenStack Kubernetes Operator team, org.label-schema.build-date=20251202, org.label-schema.name=CentOS Stream 9 Base Image, managed_by=edpm_ansible, tcib_build_tag=c3923531bcda0b0811b2d5053f189beb, tcib_managed=true, config_data={'environment': {'KOLLA_CONFIG_STRATEGY': 'COPY_ALWAYS'}, 'healthcheck': {'mount': '/var/lib/openstack/healthchecks/multipathd', 'test': '/openstack/healthcheck'}, 'image': 'quay.io/podified-antelope-centos9/openstack-multipathd:current-podified', 'net': 'host', 'privileged': True, 'restart': 'always', 'volumes': ['/etc/hosts:/etc/hosts:ro', '/etc/localtime:/etc/localtime:ro', '/etc/pki/ca-trust/extracted:/etc/pki/ca-trust/extracted:ro', '/etc/pki/ca-trust/source/anchors:/etc/pki/ca-trust/source/anchors:ro', '/etc/pki/tls/certs/ca-bundle.crt:/etc/pki/tls/certs/ca-bundle.crt:ro', '/etc/pki/tls/certs/ca-bundle.trust.crt:/etc/pki/tls/certs/ca-bundle.trust.crt:ro', '/etc/pki/tls/cert.pem:/etc/pki/tls/cert.pem:ro', '/dev/log:/dev/log', '/var/lib/kolla/config_files/multipathd.json:/var/lib/kolla/config_files/config.json:ro', '/dev:/dev', '/run/udev:/run/udev', '/sys:/sys', '/lib/modules:/lib/modules:ro', '/etc/iscsi:/etc/iscsi:ro', '/var/lib/iscsi:/var/lib/iscsi', '/etc/multipath:/etc/multipath:z', '/etc/multipath.conf:/etc/multipath.conf:ro', '/var/lib/openstack/healthchecks/multipathd:/openstack:ro,z']}, container_name=multipathd, org.label-schema.license=GPLv2, org.label-schema.schema-version=1.0, config_id=multipathd)
Dec 11 01:36:10 np0005554845 systemd[1]: Started Hostname Service.
Dec 11 01:36:12 np0005554845 nova_compute[187128]: 2025-12-11 06:36:12.930 187132 DEBUG ovsdbapp.backend.ovs_idl.vlog [-] [POLLIN] on fd 27 __log_wakeup /usr/lib64/python3.9/site-packages/ovs/poller.py:263#033[00m
